Mar 07 06:51:41 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 07 06:51:41 crc restorecon[4811]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 07 06:51:41 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 06:51:42 crc restorecon[4811]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc 
restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc 
restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 
06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc 
restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc 
restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42
crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc 
restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:51:42 crc restorecon[4811]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc 
restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc 
restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:51:42 crc restorecon[4811]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 07 06:51:43 crc kubenswrapper[4941]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 06:51:43 crc kubenswrapper[4941]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 07 06:51:43 crc kubenswrapper[4941]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 06:51:43 crc kubenswrapper[4941]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 07 06:51:43 crc kubenswrapper[4941]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 07 06:51:43 crc kubenswrapper[4941]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.718958 4941 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.725842 4941 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726057 4941 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726066 4941 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726072 4941 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726077 4941 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726080 4941 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726084 4941 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726088 4941 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726093 4941 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726097 4941 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726100 4941 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726104 4941 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726108 4941 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726111 4941 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726115 4941 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726118 4941 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726122 4941 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726126 4941 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726129 4941 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726133 4941 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726137 4941 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726140 4941 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726144 4941 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726147 4941 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726151 4941 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726155 4941 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726159 4941 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726162 4941 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726165 4941 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726169 4941 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726172 4941 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726176 4941 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726179 4941 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726184 4941 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726188 4941 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726193 4941 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726196 4941 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726201 4941 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726206 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726212 4941 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726217 4941 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726221 4941 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726226 4941 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726230 4941 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726234 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726240 4941 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726244 4941 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726248 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726251 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726255 4941 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726259 4941 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726262 4941 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726266 4941 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726270 4941 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726275 4941 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726279 4941 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726283 4941 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726286 4941 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726290 4941 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726293 4941 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726297 4941 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726300 4941 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726303 4941 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726307 4941 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726312 4941 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726317 4941 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726321 4941 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726324 4941 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726328 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726331 4941 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.726335 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726461 4941 flags.go:64] FLAG: --address="0.0.0.0"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726472 4941 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726482 4941 flags.go:64] FLAG: --anonymous-auth="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726487 4941 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726493 4941 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726498 4941 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726504 4941 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726510 4941 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726514 4941 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726519 4941 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726524 4941 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726528 4941 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726534 4941 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726538 4941 flags.go:64] FLAG: --cgroup-root=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726542 4941 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726547 4941 flags.go:64] FLAG: --client-ca-file=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726551 4941 flags.go:64] FLAG: --cloud-config=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726555 4941 flags.go:64] FLAG: --cloud-provider=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726559 4941 flags.go:64] FLAG: --cluster-dns="[]"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726566 4941 flags.go:64] FLAG: --cluster-domain=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726570 4941 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726574 4941 flags.go:64] FLAG: --config-dir=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726578 4941 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726583 4941 flags.go:64] FLAG: --container-log-max-files="5"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726589 4941 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726593 4941 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726598 4941 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726602 4941 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726607 4941 flags.go:64] FLAG: --contention-profiling="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726611 4941 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726615 4941 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726619 4941 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726625 4941 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726631 4941 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726635 4941 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726639 4941 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726643 4941 flags.go:64] FLAG: --enable-load-reader="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726648 4941 flags.go:64] FLAG: --enable-server="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726652 4941 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726658 4941 flags.go:64] FLAG: --event-burst="100"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726662 4941 flags.go:64] FLAG: --event-qps="50"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726666 4941 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726671 4941 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726676 4941 flags.go:64] FLAG: --eviction-hard=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726681 4941 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726685 4941 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726689 4941 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726694 4941 flags.go:64] FLAG: --eviction-soft=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726698 4941 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726702 4941 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726706 4941 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726710 4941 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726715 4941 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726721 4941 flags.go:64] FLAG: --fail-swap-on="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726725 4941 flags.go:64] FLAG: --feature-gates=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726730 4941 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726734 4941 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726739 4941 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726743 4941 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726748 4941 flags.go:64] FLAG: --healthz-port="10248"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726752 4941 flags.go:64] FLAG: --help="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726756 4941 flags.go:64] FLAG: --hostname-override=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726760 4941 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726765 4941 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726769 4941 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726774 4941 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726778 4941 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726782 4941 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726786 4941 flags.go:64] FLAG: --image-service-endpoint=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726790 4941 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726795 4941 flags.go:64] FLAG: --kube-api-burst="100"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726799 4941 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726803 4941 flags.go:64] FLAG: --kube-api-qps="50"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726807 4941 flags.go:64] FLAG: --kube-reserved=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726811 4941 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726815 4941 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726819 4941 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726823 4941 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726828 4941 flags.go:64] FLAG: --lock-file=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726832 4941 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726836 4941 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726840 4941 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726847 4941 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726852 4941 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726856 4941 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726861 4941 flags.go:64] FLAG: --logging-format="text"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726865 4941 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726870 4941 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726875 4941 flags.go:64] FLAG: --manifest-url=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726879 4941 flags.go:64] FLAG: --manifest-url-header=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726885 4941 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726889 4941 flags.go:64] FLAG: --max-open-files="1000000"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726894 4941 flags.go:64] FLAG: --max-pods="110"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726898 4941 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726903 4941 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726907 4941 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726911 4941 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726916 4941 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726920 4941 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726924 4941 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726939 4941 flags.go:64] FLAG: --node-status-max-images="50"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726944 4941 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726948 4941 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726952 4941 flags.go:64] FLAG: --pod-cidr=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726956 4941 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726965 4941 flags.go:64] FLAG: --pod-manifest-path=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726969 4941 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726973 4941 flags.go:64] FLAG: --pods-per-core="0"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726977 4941 flags.go:64] FLAG: --port="10250"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726982 4941 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726986 4941 flags.go:64] FLAG: --provider-id=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726990 4941 flags.go:64] FLAG: --qos-reserved=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726994 4941 flags.go:64] FLAG: --read-only-port="10255"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.726999 4941 flags.go:64] FLAG: --register-node="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727003 4941 flags.go:64] FLAG: --register-schedulable="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727007 4941 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727014 4941 flags.go:64] FLAG: --registry-burst="10"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727018 4941 flags.go:64] FLAG: --registry-qps="5"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727022 4941 flags.go:64] FLAG: --reserved-cpus=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727027 4941 flags.go:64] FLAG: --reserved-memory=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727032 4941 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727036 4941 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727041 4941 flags.go:64] FLAG: --rotate-certificates="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727045 4941 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727049 4941 flags.go:64] FLAG: --runonce="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727053 4941 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727058 4941 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727062 4941 flags.go:64] FLAG: --seccomp-default="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727066 4941 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727071 4941 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727076 4941 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727080 4941 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727085 4941 flags.go:64] FLAG: --storage-driver-password="root"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727089 4941 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727093 4941 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727097 4941 flags.go:64] FLAG: --storage-driver-user="root"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727101 4941 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727106 4941 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727110 4941 flags.go:64] FLAG: --system-cgroups=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727114 4941 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727122 4941 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727126 4941 flags.go:64] FLAG: --tls-cert-file=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727130 4941 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727136 4941 flags.go:64] FLAG: --tls-min-version=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727140 4941 flags.go:64] FLAG: --tls-private-key-file=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727145 4941 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727149 4941 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727154 4941 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727158 4941 flags.go:64] FLAG: --v="2"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727164 4941 flags.go:64] FLAG: --version="false"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727169 4941 flags.go:64] FLAG: --vmodule=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727175 4941 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727179 4941 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727298 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727304 4941 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727308 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727313 4941 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727316 4941 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727320 4941 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727324 4941 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727329 4941 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727333 4941 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727338 4941 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727341 4941 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727345 4941 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727349 4941 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727353 4941 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727357 4941 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727361 4941 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727365 4941 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727368 4941 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727372 4941 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727376 4941 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727379 4941 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727383 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727386 4941 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727390 4941 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727393 4941 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727396 4941 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727403 4941 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727407 4941 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727412 4941 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727416 4941 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727419 4941 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727439 4941 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727442 4941 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727446 4941 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727450 4941 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727453 4941 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727457 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727460 4941 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727464 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727467 4941 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727471 4941 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727475 4941 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727478 4941 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727487 4941 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727490 4941 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727494 4941 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727497 4941 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727500 4941 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727505 4941 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727510 4941 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727514 4941 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727519 4941 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727522 4941 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727525 4941 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727529 4941 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727532 4941 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727536 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727539 4941 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727543 4941 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727546 4941 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727550 4941 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727553 4941 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727556 4941 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727560 4941 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727564 4941 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727567 4941 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727571 4941 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727574 4941 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727577 4941 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727581 4941 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.727584 4941 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.727597 4941 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.737876 4941 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.737946 4941 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738029 4941 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738039 4941 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738044 4941 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738051 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738057 4941 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738063 4941 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738067 4941 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738071 4941 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738075 4941 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738082 4941 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738085 4941 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738089 4941 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738132 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738137 4941 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738142 4941 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738147 4941 feature_gate.go:330] unrecognized feature gate: 
AWSClusterHostedDNS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738152 4941 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738156 4941 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738202 4941 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738205 4941 feature_gate.go:330] unrecognized feature gate: Example Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738210 4941 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738215 4941 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738220 4941 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738225 4941 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738231 4941 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738238 4941 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738244 4941 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738250 4941 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738255 4941 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738261 4941 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738285 4941 feature_gate.go:330] 
unrecognized feature gate: EtcdBackendQuota Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738290 4941 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738294 4941 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738298 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738312 4941 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738318 4941 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738327 4941 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738331 4941 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738335 4941 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738339 4941 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738343 4941 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738347 4941 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738351 4941 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738355 4941 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738360 4941 feature_gate.go:353] Setting GA 
feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738365 4941 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738369 4941 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738374 4941 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738380 4941 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738385 4941 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738389 4941 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738393 4941 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738397 4941 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738401 4941 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738406 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738410 4941 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738413 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738419 4941 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738439 4941 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738443 4941 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738446 4941 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738450 4941 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738453 4941 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738457 4941 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738461 4941 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738465 4941 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738469 4941 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738473 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738476 4941 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738480 4941 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738491 4941 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.738500 4941 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738670 4941 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738678 4941 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738682 4941 feature_gate.go:330] unrecognized feature gate: Example Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738687 4941 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738691 4941 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738695 4941 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738698 4941 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738702 4941 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738706 4941 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738710 4941 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738714 4941 feature_gate.go:330] unrecognized feature 
gate: ClusterAPIInstall Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738718 4941 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738722 4941 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738729 4941 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738733 4941 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738738 4941 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738741 4941 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738745 4941 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738749 4941 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738759 4941 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738763 4941 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738768 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738772 4941 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738775 4941 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738780 4941 
feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738785 4941 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738790 4941 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738794 4941 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738798 4941 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738802 4941 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738806 4941 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738809 4941 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738813 4941 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738816 4941 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738827 4941 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738830 4941 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738834 4941 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738838 4941 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 
06:51:43.738841 4941 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738844 4941 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738848 4941 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738852 4941 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738857 4941 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738861 4941 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738867 4941 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738872 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738875 4941 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738879 4941 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738883 4941 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738887 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738890 4941 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738894 4941 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 
06:51:43.738899 4941 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738903 4941 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738907 4941 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738911 4941 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738915 4941 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738920 4941 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738924 4941 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738928 4941 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738932 4941 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738936 4941 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738939 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738943 4941 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738946 4941 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738950 4941 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738953 4941 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig 
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738957 4941 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738960 4941 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738964 4941 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.738974 4941 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.738981 4941 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.739244 4941 server.go:940] "Client rotation is on, will bootstrap in background" Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.745412 4941 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.752793 4941 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.752956 4941 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.754949 4941 server.go:997] "Starting client certificate rotation" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.755015 4941 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.755268 4941 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.787393 4941 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.789035 4941 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.790653 4941 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.808060 4941 log.go:25] "Validated CRI v1 runtime API" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.850800 4941 log.go:25] "Validated CRI v1 image API" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.852274 4941 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.858238 4941 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-07-06-41-02-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.858270 4941 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:29 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.878992 4941 manager.go:217] Machine: {Timestamp:2026-03-07 06:51:43.875788186 +0000 UTC m=+0.828153691 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d663d73a-c5af-4e4f-81fe-9bf574386cbb BootID:0215de21-ea41-473f-b375-94c4974c5b21 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:29 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:65:a8:b1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:65:a8:b1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:75:d4:8b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:74:10:cf Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:cb:af:64 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b5:64:c9 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:7c:fa:79 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:3e:ed:cf Speed:-1 Mtu:1496} {Name:eth10 MacAddress:16:95:c0:55:d6:0a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8e:86:0b:0b:a9:af Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 
BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.879346 4941 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.879586 4941 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.879920 4941 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.880134 4941 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.880179 4941 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.880449 4941 topology_manager.go:138] "Creating topology manager with none policy" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.880464 4941 container_manager_linux.go:303] "Creating device plugin manager" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.881031 4941 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.881070 4941 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.881312 4941 state_mem.go:36] "Initialized new in-memory state store" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.881457 4941 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.884908 4941 kubelet.go:418] "Attempting to sync node with API server" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.884937 4941 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.884957 4941 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.884974 4941 kubelet.go:324] "Adding apiserver pod source" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.885001 4941 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 
06:51:43.889206 4941 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.890517 4941 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.891348 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.891476 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.891559 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.891688 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.891916 4941 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 07 06:51:43 
crc kubenswrapper[4941]: I0307 06:51:43.893562 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893590 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893598 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893606 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893619 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893627 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893635 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893650 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893684 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893692 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893714 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.893722 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.895672 4941 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.896199 4941 server.go:1280] "Started kubelet" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 
06:51:43.897339 4941 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.897332 4941 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.897709 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:43 crc systemd[1]: Started Kubernetes Kubelet. Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.899181 4941 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.904800 4941 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.904855 4941 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.905513 4941 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.905558 4941 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.905585 4941 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.905551 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.906540 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 
06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.906638 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.907882 4941 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.907910 4941 factory.go:55] Registering systemd factory Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.907921 4941 factory.go:221] Registration of the systemd container factory successfully Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.907994 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.908215 4941 factory.go:153] Registering CRI-O factory Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.908250 4941 factory.go:221] Registration of the crio container factory successfully Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.908282 4941 factory.go:103] Registering Raw factory Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.908306 4941 manager.go:1196] Started watching for new ooms in manager Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.908846 4941 server.go:460] "Adding debug handlers to kubelet server" Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.906895 4941 event.go:368] "Unable to write event 
(may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189a7c82a5fda0c7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.896166599 +0000 UTC m=+0.848532064,LastTimestamp:2026-03-07 06:51:43.896166599 +0000 UTC m=+0.848532064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.909108 4941 manager.go:319] Starting recovery of all containers Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920198 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920249 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920261 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920270 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920279 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920289 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920298 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920308 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920318 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920327 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920335 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920343 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920352 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920363 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920371 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920381 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920392 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920404 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920480 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920490 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920499 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920508 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920516 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920526 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920535 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920543 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920556 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920567 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" 
seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920576 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920586 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920595 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920606 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920616 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920626 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 
06:51:43.920637 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920647 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920659 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920672 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920686 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920698 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920711 4941 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920722 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920735 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920747 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920759 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920771 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920783 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920799 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920812 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920824 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920836 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920848 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920864 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920877 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920890 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920903 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920914 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920927 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920938 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920950 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920963 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920975 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920986 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.920998 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921038 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" 
seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921051 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921063 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921075 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921087 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921099 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921112 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 07 06:51:43 crc 
kubenswrapper[4941]: I0307 06:51:43.921126 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921138 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921152 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921163 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921174 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921187 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921197 4941 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921209 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921222 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921233 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921244 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921256 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921266 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921277 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921288 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921297 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921313 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921323 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921336 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921348 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921358 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921371 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921383 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921395 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921413 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921442 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921456 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921468 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921479 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921493 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921505 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921517 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921529 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921547 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921560 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921573 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921586 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" 
seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921599 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921611 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921623 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921638 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921651 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921663 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921678 
4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921689 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921700 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921713 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921724 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921738 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921750 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921762 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921775 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.921789 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.923916 4941 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.923944 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.923958 4941 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.923974 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.923987 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924001 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924013 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924032 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924044 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924056 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924067 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924077 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924091 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924103 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924115 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924126 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924137 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924148 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924159 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924172 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924183 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" 
seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924197 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924207 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924218 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924232 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924243 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924255 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924268 4941 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924281 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924295 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924309 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924322 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924334 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924347 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924361 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924376 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924389 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924407 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924433 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924449 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924460 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924471 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924483 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924495 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924508 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924519 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924532 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924544 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924556 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924569 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924584 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924598 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924611 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924624 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924636 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924648 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924674 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924687 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924698 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924711 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924722 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924773 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924785 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924797 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924809 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924822 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924836 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924850 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924862 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924877 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924888 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924900 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924913 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924926 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924938 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924951 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924962 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924974 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924986 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.924998 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.925009 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.925021 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.925033 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.925047 4941 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.925058 4941 reconstruct.go:97] "Volume reconstruction finished"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.925066 4941 reconciler.go:26] "Reconciler: start to sync state"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.934369 4941 manager.go:324] Recovery completed
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.949613 4941 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.949803 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.951950 4941 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.952110 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.952137 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.952161 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.953184 4941 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.953298 4941 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.953385 4941 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.954416 4941 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.954459 4941 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.954482 4941 state_mem.go:36] "Initialized new in-memory state store"
Mar 07 06:51:43 crc kubenswrapper[4941]: W0307 06:51:43.956289 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 07 06:51:43 crc kubenswrapper[4941]: E0307 06:51:43.956362 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.985469 4941 policy_none.go:49] "None policy: Start"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.986481 4941 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 07 06:51:43 crc kubenswrapper[4941]: I0307 06:51:43.986540 4941 state_mem.go:35] "Initializing new in-memory state store"
Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.006168 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.040290 4941 manager.go:334] "Starting Device Plugin manager"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.040519 4941 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.040621 4941 server.go:79] "Starting device plugin registration server"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.041259 4941 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.041348 4941 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.041660 4941 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.041915 4941 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.041933 4941 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.053247 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.054442 4941 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.054547 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.055469 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.055500 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.055508 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.055622 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.055872 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.055941 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.056493 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.056512 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.056520 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.056627 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.056781 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.056829 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.056986 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057045 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057081 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057111 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057126 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057134 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057215 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057367 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057390 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057532 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057559 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.057588 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058141 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058157 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058165 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058176 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058189 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058177 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058287 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058452 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058480 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058925 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058955 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.058965 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.059130 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.059151 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.059273 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.059293 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.059302 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.059872 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.059891 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.059899 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.109003 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.126826 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.126871 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.126899 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.126924 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.126945 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.126967 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.126988 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.127012 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.127066 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.127086 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.127106 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.127127 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.127146 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.127166 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.127186 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.142109 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.143569 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.143609 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.143622 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.143648 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.144725 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228427 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228505 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228529 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228545 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228560 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228575 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228592 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228607 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228622 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228660 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228675 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228666 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 07 06:51:44 crc kubenswrapper[4941]: I0307
06:51:44.228733 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228692 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228774 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228803 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228822 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228823 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228838 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228856 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228878 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228900 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228922 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228944 4941 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228964 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228985 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.228804 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.229012 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.229031 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.229051 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.345701 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.347538 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.347589 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.347600 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.347628 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.348229 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.384860 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.390582 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.405641 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.424045 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: W0307 06:51:44.430017 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d7396491129e746d70a8e07c23b1f3006df76dc016881c844fcaa2744bbbf6b1 WatchSource:0}: Error finding container d7396491129e746d70a8e07c23b1f3006df76dc016881c844fcaa2744bbbf6b1: Status 404 returned error can't find the container with id d7396491129e746d70a8e07c23b1f3006df76dc016881c844fcaa2744bbbf6b1 Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.430515 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:51:44 crc kubenswrapper[4941]: W0307 06:51:44.431100 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d2cb6f605e48a20785e109abb9eb1062cecf51e697a51ddbed1ff966340e1e96 WatchSource:0}: Error finding container d2cb6f605e48a20785e109abb9eb1062cecf51e697a51ddbed1ff966340e1e96: Status 404 returned error can't find the container with id d2cb6f605e48a20785e109abb9eb1062cecf51e697a51ddbed1ff966340e1e96 Mar 07 06:51:44 crc kubenswrapper[4941]: W0307 06:51:44.436827 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-9672a034880057ddddebbd8d5025c2f5ff392ad10353c79a2dc95fc0552b585d WatchSource:0}: Error finding container 9672a034880057ddddebbd8d5025c2f5ff392ad10353c79a2dc95fc0552b585d: Status 404 returned error can't find 
the container with id 9672a034880057ddddebbd8d5025c2f5ff392ad10353c79a2dc95fc0552b585d Mar 07 06:51:44 crc kubenswrapper[4941]: W0307 06:51:44.443459 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-427f5f537ec758cd606489199414fb28567e3d0d2ed3fb4f8f4a9dd148fcd532 WatchSource:0}: Error finding container 427f5f537ec758cd606489199414fb28567e3d0d2ed3fb4f8f4a9dd148fcd532: Status 404 returned error can't find the container with id 427f5f537ec758cd606489199414fb28567e3d0d2ed3fb4f8f4a9dd148fcd532 Mar 07 06:51:44 crc kubenswrapper[4941]: W0307 06:51:44.446786 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-cce3bc0bbb46ee06ada6cd2a714cff6567e3d6e7ec9a071742516c9448e2a7c4 WatchSource:0}: Error finding container cce3bc0bbb46ee06ada6cd2a714cff6567e3d6e7ec9a071742516c9448e2a7c4: Status 404 returned error can't find the container with id cce3bc0bbb46ee06ada6cd2a714cff6567e3d6e7ec9a071742516c9448e2a7c4 Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.511318 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Mar 07 06:51:44 crc kubenswrapper[4941]: W0307 06:51:44.719589 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.719717 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.748903 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.750257 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.750309 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.750326 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.750371 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.751160 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.899306 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.962924 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9672a034880057ddddebbd8d5025c2f5ff392ad10353c79a2dc95fc0552b585d"} Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 
06:51:44.964430 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d2cb6f605e48a20785e109abb9eb1062cecf51e697a51ddbed1ff966340e1e96"} Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.965260 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d7396491129e746d70a8e07c23b1f3006df76dc016881c844fcaa2744bbbf6b1"} Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.966817 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cce3bc0bbb46ee06ada6cd2a714cff6567e3d6e7ec9a071742516c9448e2a7c4"} Mar 07 06:51:44 crc kubenswrapper[4941]: I0307 06:51:44.967820 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"427f5f537ec758cd606489199414fb28567e3d0d2ed3fb4f8f4a9dd148fcd532"} Mar 07 06:51:44 crc kubenswrapper[4941]: W0307 06:51:44.986846 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.986959 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:44 crc 
kubenswrapper[4941]: W0307 06:51:44.999764 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:44 crc kubenswrapper[4941]: E0307 06:51:44.999824 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:45 crc kubenswrapper[4941]: E0307 06:51:45.312471 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Mar 07 06:51:45 crc kubenswrapper[4941]: W0307 06:51:45.501665 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:45 crc kubenswrapper[4941]: E0307 06:51:45.501838 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.551798 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.554966 4941 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.555013 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.555022 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.555049 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:51:45 crc kubenswrapper[4941]: E0307 06:51:45.555612 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.898964 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.933572 4941 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 06:51:45 crc kubenswrapper[4941]: E0307 06:51:45.934718 4941 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.972653 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb" exitCode=0 Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.972780 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb"} Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.972859 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.974291 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.974337 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.974346 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.974792 4941 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba" exitCode=0 Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.974869 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba"} Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.974980 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.976187 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:45 crc kubenswrapper[4941]: 
I0307 06:51:45.976302 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.976367 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.976382 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.977579 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.977648 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.977670 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.977841 4941 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="db2929871f62a01c2af78ea47ee01f9e9f9d691456befae41a502a92eda56ec5" exitCode=0 Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.977980 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.978009 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"db2929871f62a01c2af78ea47ee01f9e9f9d691456befae41a502a92eda56ec5"} Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.979147 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.979187 4941 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.979199 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.980150 4941 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143" exitCode=0 Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.980214 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143"} Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.980278 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.981483 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.981513 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.981523 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.983242 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548"} Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.983274 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02"} Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.983286 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990"} Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.983294 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d"} Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.983402 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.984115 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.984153 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:45 crc kubenswrapper[4941]: I0307 06:51:45.984168 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:46 crc kubenswrapper[4941]: W0307 06:51:46.749747 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:46 crc kubenswrapper[4941]: E0307 06:51:46.749860 4941 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:46 crc kubenswrapper[4941]: W0307 06:51:46.852778 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:46 crc kubenswrapper[4941]: E0307 06:51:46.852862 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.899327 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:46 crc kubenswrapper[4941]: E0307 06:51:46.915965 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.991592 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a"} Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.991640 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6"} Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.991650 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b"} Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.991669 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.992797 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.992854 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.992872 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.996315 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f"} Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.996370 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96"} Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.996392 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148"} Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.996430 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76"} Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.998862 4941 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863" exitCode=0 Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.998971 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.999037 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863"} Mar 07 06:51:46 crc kubenswrapper[4941]: I0307 06:51:46.999654 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.000158 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.000187 4941 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.000197 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.002357 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.002370 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.002333 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ec162924f481128c0d328fb9f75d9e38e9942d841b84d37bebdd2006f6d3f9ee"} Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.003391 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.003435 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.003449 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.003448 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.003610 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.007786 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.156172 4941 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.158520 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.158621 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.158682 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:47 crc kubenswrapper[4941]: I0307 06:51:47.158746 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:51:47 crc kubenswrapper[4941]: E0307 06:51:47.159137 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 07 06:51:47 crc kubenswrapper[4941]: W0307 06:51:47.437975 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 07 06:51:47 crc kubenswrapper[4941]: E0307 06:51:47.438530 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.008940 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0a437d055dc9415c1ad782193e7bcd9602cfc60894f0414e168ae088ab985ff1"} Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.009029 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.009924 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.009986 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.010004 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.012162 4941 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4" exitCode=0 Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.012207 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4"} Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.012288 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.012301 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.012316 4941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.012366 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.012296 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014006 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014048 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014054 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014080 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014092 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014111 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014061 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014165 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014183 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014573 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014607 4941 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:48 crc kubenswrapper[4941]: I0307 06:51:48.014621 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.019333 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a"} Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.019406 4941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.019507 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.020243 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd"} Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.020389 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee"} Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.020439 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a"} Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.020758 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.020785 4941 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.020795 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.585984 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.586262 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.588050 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.588136 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.588148 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.595856 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:51:49 crc kubenswrapper[4941]: I0307 06:51:49.790848 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.029649 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235"} Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.029710 4941 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.029810 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.031686 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.031727 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.031740 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.032143 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.032215 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.032238 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.323674 4941 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.359326 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.361126 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.361165 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.361176 4941 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:50 crc kubenswrapper[4941]: I0307 06:51:50.361206 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.032246 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.032291 4941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.032380 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.033215 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.033252 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.033264 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.034065 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.034132 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.034154 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.539854 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.540236 4941 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.540317 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.542212 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.542268 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.542280 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:51 crc kubenswrapper[4941]: I0307 06:51:51.971968 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.035389 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.036902 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.036959 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.036971 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.288998 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.289260 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:52 crc 
kubenswrapper[4941]: I0307 06:51:52.291178 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.291233 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.291243 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.520874 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.521087 4941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.521135 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.522567 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.522598 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.522613 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.791040 4941 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 06:51:52 crc kubenswrapper[4941]: I0307 06:51:52.791177 4941 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.460829 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.461066 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.462595 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.462651 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.462662 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.720496 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.720770 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.722721 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.722778 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:53 crc kubenswrapper[4941]: I0307 06:51:53.722791 4941 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:54 crc kubenswrapper[4941]: E0307 06:51:54.054363 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 06:51:57 crc kubenswrapper[4941]: I0307 06:51:57.899438 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 07 06:51:58 crc kubenswrapper[4941]: W0307 06:51:58.205482 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 07 06:51:58 crc kubenswrapper[4941]: I0307 06:51:58.205581 4941 trace.go:236] Trace[1855801133]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Mar-2026 06:51:48.203) (total time: 10002ms): Mar 07 06:51:58 crc kubenswrapper[4941]: Trace[1855801133]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:51:58.205) Mar 07 06:51:58 crc kubenswrapper[4941]: Trace[1855801133]: [10.00207928s] [10.00207928s] END Mar 07 06:51:58 crc kubenswrapper[4941]: E0307 06:51:58.205610 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 07 06:51:58 crc kubenswrapper[4941]: W0307 06:51:58.827497 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z Mar 07 06:51:58 crc kubenswrapper[4941]: E0307 06:51:58.827606 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:51:58 crc kubenswrapper[4941]: W0307 06:51:58.831847 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z Mar 07 06:51:58 crc kubenswrapper[4941]: E0307 06:51:58.831956 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:51:58 crc kubenswrapper[4941]: W0307 06:51:58.833204 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z Mar 07 06:51:58 crc kubenswrapper[4941]: E0307 06:51:58.833275 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:51:58 crc kubenswrapper[4941]: E0307 06:51:58.835281 4941 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7c82a5fda0c7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.896166599 +0000 UTC m=+0.848532064,LastTimestamp:2026-03-07 06:51:43.896166599 +0000 UTC m=+0.848532064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:51:58 crc kubenswrapper[4941]: E0307 06:51:58.836481 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 07 06:51:58 crc 
kubenswrapper[4941]: E0307 06:51:58.838013 4941 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:51:58 crc kubenswrapper[4941]: E0307 06:51:58.838880 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 06:51:58 crc kubenswrapper[4941]: I0307 06:51:58.848857 4941 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 06:51:58 crc kubenswrapper[4941]: I0307 06:51:58.848943 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 07 06:51:58 crc kubenswrapper[4941]: I0307 06:51:58.855090 4941 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 06:51:58 crc kubenswrapper[4941]: I0307 06:51:58.855155 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 07 06:51:58 crc kubenswrapper[4941]: I0307 06:51:58.901791 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:58Z is after 2026-02-23T05:33:13Z Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.055086 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.057190 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0a437d055dc9415c1ad782193e7bcd9602cfc60894f0414e168ae088ab985ff1" exitCode=255 Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.057273 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0a437d055dc9415c1ad782193e7bcd9602cfc60894f0414e168ae088ab985ff1"} Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.057484 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.058523 4941 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.058575 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.058587 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.059183 4941 scope.go:117] "RemoveContainer" containerID="0a437d055dc9415c1ad782193e7bcd9602cfc60894f0414e168ae088ab985ff1" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.145375 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.145661 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.147258 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.147317 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.147332 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.189161 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 07 06:51:59 crc kubenswrapper[4941]: I0307 06:51:59.901261 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:51:59Z is after 2026-02-23T05:33:13Z 
Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.063398 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.066318 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77"} Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.066520 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.066622 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.067992 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.068030 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.068039 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.068068 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.068105 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.068116 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.078079 4941 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 07 06:52:00 crc kubenswrapper[4941]: I0307 06:52:00.901357 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:00Z is after 2026-02-23T05:33:13Z Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.071989 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.072779 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.075485 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77" exitCode=255 Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.075587 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77"} Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.075707 4941 scope.go:117] "RemoveContainer" containerID="0a437d055dc9415c1ad782193e7bcd9602cfc60894f0414e168ae088ab985ff1" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.075775 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.075899 4941 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.077339 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.077417 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.077435 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.077937 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.078006 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.078034 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.079070 4941 scope.go:117] "RemoveContainer" containerID="2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77" Mar 07 06:52:01 crc kubenswrapper[4941]: E0307 06:52:01.079370 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.546707 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.902120 4941 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:01Z is after 2026-02-23T05:33:13Z Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.977439 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.977644 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.979343 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.979421 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:01 crc kubenswrapper[4941]: I0307 06:52:01.979438 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.080936 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.083189 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.084478 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.084536 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:02 crc 
kubenswrapper[4941]: I0307 06:52:02.084547 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.085152 4941 scope.go:117] "RemoveContainer" containerID="2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77" Mar 07 06:52:02 crc kubenswrapper[4941]: E0307 06:52:02.085351 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.088627 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.791822 4941 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.791944 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 06:52:02 crc kubenswrapper[4941]: I0307 06:52:02.901283 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:02Z is after 2026-02-23T05:33:13Z Mar 07 06:52:03 crc kubenswrapper[4941]: I0307 06:52:03.085898 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:03 crc kubenswrapper[4941]: I0307 06:52:03.087205 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:03 crc kubenswrapper[4941]: I0307 06:52:03.087267 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:03 crc kubenswrapper[4941]: I0307 06:52:03.087279 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:03 crc kubenswrapper[4941]: I0307 06:52:03.087990 4941 scope.go:117] "RemoveContainer" containerID="2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77" Mar 07 06:52:03 crc kubenswrapper[4941]: E0307 06:52:03.088207 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:03 crc kubenswrapper[4941]: W0307 06:52:03.672479 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:03Z is after 2026-02-23T05:33:13Z 
Mar 07 06:52:03 crc kubenswrapper[4941]: E0307 06:52:03.672591 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:52:03 crc kubenswrapper[4941]: I0307 06:52:03.720930 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:52:03 crc kubenswrapper[4941]: I0307 06:52:03.914932 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:03Z is after 2026-02-23T05:33:13Z Mar 07 06:52:04 crc kubenswrapper[4941]: E0307 06:52:04.054645 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 06:52:04 crc kubenswrapper[4941]: I0307 06:52:04.088855 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:04 crc kubenswrapper[4941]: I0307 06:52:04.089954 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:04 crc kubenswrapper[4941]: I0307 06:52:04.089986 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:04 crc kubenswrapper[4941]: I0307 06:52:04.089994 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:04 crc kubenswrapper[4941]: 
I0307 06:52:04.090525 4941 scope.go:117] "RemoveContainer" containerID="2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77" Mar 07 06:52:04 crc kubenswrapper[4941]: E0307 06:52:04.090693 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:04 crc kubenswrapper[4941]: I0307 06:52:04.252872 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:52:04 crc kubenswrapper[4941]: I0307 06:52:04.901937 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:04Z is after 2026-02-23T05:33:13Z Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.091544 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.092583 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.092639 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.092651 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.093471 4941 scope.go:117] "RemoveContainer" 
containerID="2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77" Mar 07 06:52:05 crc kubenswrapper[4941]: E0307 06:52:05.093670 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.239017 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:05 crc kubenswrapper[4941]: E0307 06:52:05.240531 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:05Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.241053 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.241109 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.241129 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.241167 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:52:05 crc kubenswrapper[4941]: E0307 06:52:05.244447 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:52:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 06:52:05 crc kubenswrapper[4941]: I0307 06:52:05.906015 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:06 crc kubenswrapper[4941]: I0307 06:52:06.904263 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:07 crc kubenswrapper[4941]: I0307 06:52:07.116507 4941 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 06:52:07 crc kubenswrapper[4941]: I0307 06:52:07.129812 4941 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 07 06:52:07 crc kubenswrapper[4941]: I0307 06:52:07.907614 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:08 crc kubenswrapper[4941]: W0307 06:52:08.007677 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.007764 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at 
the cluster scope" logger="UnhandledError" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.839517 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a5fda0c7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.896166599 +0000 UTC m=+0.848532064,LastTimestamp:2026-03-07 06:51:43.896166599 +0000 UTC m=+0.848532064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.843435 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a95390ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC m=+0.904495749,LastTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC m=+0.904495749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.848160 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a953c5e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC m=+0.904509309,LastTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC m=+0.904509309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.853225 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a954202b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952166955 +0000 UTC m=+0.904532420,LastTimestamp:2026-03-07 06:51:43.952166955 +0000 UTC m=+0.904532420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.859418 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82aeb59b44 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:44.04244154 +0000 UTC m=+0.994807005,LastTimestamp:2026-03-07 06:51:44.04244154 +0000 UTC m=+0.994807005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.864511 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a95390ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a95390ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC m=+0.904495749,LastTimestamp:2026-03-07 06:51:44.055485496 +0000 UTC m=+1.007850961,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.871549 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a953c5e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a953c5e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC m=+0.904509309,LastTimestamp:2026-03-07 06:51:44.055505537 +0000 UTC m=+1.007870992,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.876822 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a954202b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a954202b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952166955 +0000 UTC m=+0.904532420,LastTimestamp:2026-03-07 06:51:44.055513357 +0000 UTC m=+1.007878812,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.883622 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a95390ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a95390ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC 
m=+0.904495749,LastTimestamp:2026-03-07 06:51:44.056506771 +0000 UTC m=+1.008872236,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.888028 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a953c5e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a953c5e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC m=+0.904509309,LastTimestamp:2026-03-07 06:51:44.056517551 +0000 UTC m=+1.008883016,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.892860 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a954202b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a954202b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952166955 +0000 UTC m=+0.904532420,LastTimestamp:2026-03-07 06:51:44.056524411 +0000 UTC m=+1.008889876,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.897115 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a95390ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a95390ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC m=+0.904495749,LastTimestamp:2026-03-07 06:51:44.057035844 +0000 UTC m=+1.009401309,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: I0307 06:52:08.901003 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.901127 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a953c5e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a953c5e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC 
m=+0.904509309,LastTimestamp:2026-03-07 06:51:44.057055194 +0000 UTC m=+1.009420659,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.905272 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a954202b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a954202b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952166955 +0000 UTC m=+0.904532420,LastTimestamp:2026-03-07 06:51:44.057086375 +0000 UTC m=+1.009451840,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.909017 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a95390ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a95390ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC m=+0.904495749,LastTimestamp:2026-03-07 06:51:44.057122396 +0000 UTC m=+1.009487851,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.913247 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a953c5e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a953c5e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC m=+0.904509309,LastTimestamp:2026-03-07 06:51:44.057131746 +0000 UTC m=+1.009497211,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.916654 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a954202b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a954202b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952166955 +0000 UTC m=+0.904532420,LastTimestamp:2026-03-07 06:51:44.057138816 +0000 UTC m=+1.009504281,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.920103 4941 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189a7c82a95390ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a95390ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC m=+0.904495749,LastTimestamp:2026-03-07 06:51:44.057549946 +0000 UTC m=+1.009915431,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.924641 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a953c5e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a953c5e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC m=+0.904509309,LastTimestamp:2026-03-07 06:51:44.057580717 +0000 UTC m=+1.009946202,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.928871 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a954202b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a954202b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952166955 +0000 UTC m=+0.904532420,LastTimestamp:2026-03-07 06:51:44.057594777 +0000 UTC m=+1.009960242,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.932843 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a95390ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a95390ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC m=+0.904495749,LastTimestamp:2026-03-07 06:51:44.058157491 +0000 UTC m=+1.010522956,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.937246 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a95390ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a95390ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952130284 +0000 UTC m=+0.904495749,LastTimestamp:2026-03-07 06:51:44.058169361 +0000 UTC m=+1.010534826,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.938865 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a953c5e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a953c5e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC m=+0.904509309,LastTimestamp:2026-03-07 06:51:44.058173751 +0000 UTC m=+1.010539216,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.942512 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a953c5e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a953c5e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952143844 +0000 UTC m=+0.904509309,LastTimestamp:2026-03-07 06:51:44.058184321 +0000 UTC m=+1.010549786,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.946048 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7c82a954202b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7c82a954202b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:43.952166955 +0000 UTC m=+0.904532420,LastTimestamp:2026-03-07 06:51:44.058195052 +0000 UTC m=+1.010560517,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.952003 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c82c62a7520 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:44.435975456 +0000 UTC m=+1.388340921,LastTimestamp:2026-03-07 06:51:44.435975456 +0000 UTC m=+1.388340921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.955631 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7c82c62c0dd9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:44.436080089 +0000 UTC m=+1.388445554,LastTimestamp:2026-03-07 06:51:44.436080089 +0000 UTC m=+1.388445554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.960430 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c82c671a963 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:44.440641891 +0000 UTC m=+1.393007356,LastTimestamp:2026-03-07 06:51:44.440641891 +0000 UTC m=+1.393007356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.965198 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c82c6ef30fb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:44.448868603 +0000 UTC m=+1.401234078,LastTimestamp:2026-03-07 06:51:44.448868603 +0000 UTC m=+1.401234078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.970465 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c82c73e6e0c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:44.45406158 +0000 UTC m=+1.406427045,LastTimestamp:2026-03-07 06:51:44.45406158 +0000 UTC m=+1.406427045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.975226 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c82ee160d78 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.10572684 +0000 UTC m=+2.058092325,LastTimestamp:2026-03-07 06:51:45.10572684 +0000 UTC m=+2.058092325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.981099 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c82ee193057 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.105932375 +0000 UTC m=+2.058297860,LastTimestamp:2026-03-07 06:51:45.105932375 +0000 UTC m=+2.058297860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.985902 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c82ee1b1c44 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.106058308 +0000 UTC m=+2.058423773,LastTimestamp:2026-03-07 06:51:45.106058308 +0000 UTC m=+2.058423773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.990202 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c82ee361ebc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.107828412 +0000 UTC m=+2.060193877,LastTimestamp:2026-03-07 06:51:45.107828412 +0000 UTC m=+2.060193877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:08 crc kubenswrapper[4941]: E0307 06:52:08.994691 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7c82ee3636b8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.107834552 +0000 UTC m=+2.060200027,LastTimestamp:2026-03-07 06:51:45.107834552 +0000 UTC m=+2.060200027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.000759 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c82eed8b3d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.118483413 +0000 UTC m=+2.070848888,LastTimestamp:2026-03-07 06:51:45.118483413 +0000 UTC m=+2.070848888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.007528 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c82eeeeadca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.119923658 +0000 UTC m=+2.072289123,LastTimestamp:2026-03-07 06:51:45.119923658 +0000 UTC m=+2.072289123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.011929 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c82eef5301f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.120350239 +0000 UTC m=+2.072715724,LastTimestamp:2026-03-07 06:51:45.120350239 +0000 UTC m=+2.072715724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc 
kubenswrapper[4941]: E0307 06:52:09.017154 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7c82ef08a786 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.12162599 +0000 UTC m=+2.073991455,LastTimestamp:2026-03-07 06:51:45.12162599 +0000 UTC m=+2.073991455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.022264 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c82ef0c1462 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.121850466 +0000 UTC m=+2.074215941,LastTimestamp:2026-03-07 06:51:45.121850466 +0000 UTC m=+2.074215941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.027048 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c82ef3d5f85 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.125080965 +0000 UTC m=+2.077446430,LastTimestamp:2026-03-07 06:51:45.125080965 +0000 UTC m=+2.077446430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.030671 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c8301c91c39 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.436228665 +0000 UTC m=+2.388594160,LastTimestamp:2026-03-07 06:51:45.436228665 +0000 UTC 
m=+2.388594160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.035721 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c830261efd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.446244311 +0000 UTC m=+2.398609786,LastTimestamp:2026-03-07 06:51:45.446244311 +0000 UTC m=+2.398609786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.040143 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c83027a46da openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.44783945 +0000 UTC m=+2.400204925,LastTimestamp:2026-03-07 06:51:45.44783945 +0000 UTC m=+2.400204925,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.045785 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c830dfd6664 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.640982116 +0000 UTC m=+2.593347581,LastTimestamp:2026-03-07 06:51:45.640982116 +0000 UTC m=+2.593347581,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.053345 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c830eae590a openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.65257857 +0000 UTC m=+2.604944075,LastTimestamp:2026-03-07 06:51:45.65257857 +0000 UTC m=+2.604944075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.058264 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c830ec66a7b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.654155899 +0000 UTC m=+2.606521364,LastTimestamp:2026-03-07 06:51:45.654155899 +0000 UTC m=+2.606521364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 
06:52:09.062887 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c831a088d35 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.843039541 +0000 UTC m=+2.795405006,LastTimestamp:2026-03-07 06:51:45.843039541 +0000 UTC m=+2.795405006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.067896 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c831ad6205b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.856512091 +0000 UTC m=+2.808877556,LastTimestamp:2026-03-07 
06:51:45.856512091 +0000 UTC m=+2.808877556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.072537 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c8321f355fe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.975866878 +0000 UTC m=+2.928232343,LastTimestamp:2026-03-07 06:51:45.975866878 +0000 UTC m=+2.928232343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.076972 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83221f9331 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.978766129 +0000 UTC m=+2.931131594,LastTimestamp:2026-03-07 06:51:45.978766129 +0000 UTC m=+2.931131594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.080883 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7c8322387c07 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.980398599 +0000 UTC m=+2.932764084,LastTimestamp:2026-03-07 06:51:45.980398599 +0000 UTC m=+2.932764084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.084585 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c832257a1a9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.982439849 +0000 UTC m=+2.934805334,LastTimestamp:2026-03-07 06:51:45.982439849 +0000 UTC m=+2.934805334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.088044 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c833172cb5a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.235878234 +0000 UTC m=+3.188243699,LastTimestamp:2026-03-07 06:51:46.235878234 +0000 UTC m=+3.188243699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.092035 4941 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c8331c5de70 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.241322608 +0000 UTC m=+3.193688073,LastTimestamp:2026-03-07 06:51:46.241322608 +0000 UTC m=+3.193688073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.095519 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7c8331cc3090 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.241736848 +0000 UTC m=+3.194102313,LastTimestamp:2026-03-07 06:51:46.241736848 +0000 UTC m=+3.194102313,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.098810 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c8331dbd685 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.242762373 +0000 UTC m=+3.195127838,LastTimestamp:2026-03-07 06:51:46.242762373 +0000 UTC m=+3.195127838,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.102160 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c83325ab765 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.251077477 +0000 UTC m=+3.203442942,LastTimestamp:2026-03-07 06:51:46.251077477 +0000 UTC m=+3.203442942,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.106607 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c83327466ca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.252760778 +0000 UTC m=+3.205126243,LastTimestamp:2026-03-07 06:51:46.252760778 +0000 UTC m=+3.205126243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.110320 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7c8332b2d89d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.256853149 +0000 UTC m=+3.209218614,LastTimestamp:2026-03-07 06:51:46.256853149 +0000 UTC m=+3.209218614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.114667 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c8333238f4e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.26423995 +0000 UTC m=+3.216605415,LastTimestamp:2026-03-07 06:51:46.26423995 +0000 UTC m=+3.216605415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.118591 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c8333338147 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.265284935 +0000 UTC m=+3.217650410,LastTimestamp:2026-03-07 06:51:46.265284935 +0000 UTC m=+3.217650410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.122807 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83346d3da0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.28584592 +0000 UTC m=+3.238211395,LastTimestamp:2026-03-07 06:51:46.28584592 +0000 UTC m=+3.238211395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.126945 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c833f6ab3cd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.470228941 +0000 UTC m=+3.422594406,LastTimestamp:2026-03-07 06:51:46.470228941 +0000 UTC m=+3.422594406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.131694 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c833fa90be1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.474314721 +0000 UTC m=+3.426680186,LastTimestamp:2026-03-07 06:51:46.474314721 +0000 UTC m=+3.426680186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.135113 4941 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c83400e00a8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.480930984 +0000 UTC m=+3.433296449,LastTimestamp:2026-03-07 06:51:46.480930984 +0000 UTC m=+3.433296449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.139624 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c8340202b15 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.482121493 +0000 UTC m=+3.434486958,LastTimestamp:2026-03-07 06:51:46.482121493 +0000 UTC 
m=+3.434486958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.144711 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c8340542814 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.485528596 +0000 UTC m=+3.437894071,LastTimestamp:2026-03-07 06:51:46.485528596 +0000 UTC m=+3.437894071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.146251 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c8340992b95 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.490051477 +0000 UTC m=+3.442416942,LastTimestamp:2026-03-07 06:51:46.490051477 +0000 UTC m=+3.442416942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.150204 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c834bac7502 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.675864834 +0000 UTC m=+3.628230299,LastTimestamp:2026-03-07 06:51:46.675864834 +0000 UTC m=+3.628230299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.154367 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c834baf1be4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.676038628 +0000 UTC m=+3.628404093,LastTimestamp:2026-03-07 06:51:46.676038628 +0000 UTC m=+3.628404093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.157910 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7c834c77507a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.68915929 +0000 UTC m=+3.641524765,LastTimestamp:2026-03-07 06:51:46.68915929 +0000 UTC m=+3.641524765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.162890 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c834d39e345 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.701910853 +0000 UTC m=+3.654276318,LastTimestamp:2026-03-07 06:51:46.701910853 +0000 UTC m=+3.654276318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.167704 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c834d52706c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.703519852 +0000 UTC m=+3.655885317,LastTimestamp:2026-03-07 06:51:46.703519852 +0000 UTC m=+3.655885317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.172858 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c835789318a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.874880394 +0000 UTC m=+3.827245849,LastTimestamp:2026-03-07 06:51:46.874880394 +0000 UTC m=+3.827245849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.178781 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c83583c5254 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.886619732 +0000 UTC m=+3.838985197,LastTimestamp:2026-03-07 
06:51:46.886619732 +0000 UTC m=+3.838985197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.183500 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c83584fec94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.887904404 +0000 UTC m=+3.840269869,LastTimestamp:2026-03-07 06:51:46.887904404 +0000 UTC m=+3.840269869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.188528 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c835f2782a7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:47.002696359 +0000 UTC m=+3.955061834,LastTimestamp:2026-03-07 06:51:47.002696359 +0000 UTC m=+3.955061834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.192991 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c836455280b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:47.089573899 +0000 UTC m=+4.041939364,LastTimestamp:2026-03-07 06:51:47.089573899 +0000 UTC m=+4.041939364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.196052 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c8364e9d69e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:47.099317918 +0000 UTC m=+4.051683383,LastTimestamp:2026-03-07 06:51:47.099317918 +0000 UTC m=+4.051683383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.201113 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c836b4dacfd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:47.206524157 +0000 UTC m=+4.158889622,LastTimestamp:2026-03-07 06:51:47.206524157 +0000 UTC m=+4.158889622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.204607 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c836c0d2f09 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:47.219074825 +0000 UTC m=+4.171440290,LastTimestamp:2026-03-07 06:51:47.219074825 +0000 UTC m=+4.171440290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.210037 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c839b8e1fac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.016054188 +0000 UTC m=+4.968419693,LastTimestamp:2026-03-07 06:51:48.016054188 +0000 UTC m=+4.968419693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.216950 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189a7c83a7aaf975 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.219271541 +0000 UTC m=+5.171636996,LastTimestamp:2026-03-07 06:51:48.219271541 +0000 UTC m=+5.171636996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.221158 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83a856ad9d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.230524317 +0000 UTC m=+5.182889782,LastTimestamp:2026-03-07 06:51:48.230524317 +0000 UTC m=+5.182889782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.224958 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83a86b738e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.23188571 +0000 UTC m=+5.184251165,LastTimestamp:2026-03-07 06:51:48.23188571 +0000 UTC m=+5.184251165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.229143 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83b2ada36d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.403995501 +0000 UTC m=+5.356360986,LastTimestamp:2026-03-07 06:51:48.403995501 +0000 UTC m=+5.356360986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.232445 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83b37c5737 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.417541943 +0000 UTC m=+5.369907398,LastTimestamp:2026-03-07 06:51:48.417541943 +0000 UTC m=+5.369907398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.235997 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83b38a7004 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.418465796 +0000 UTC m=+5.370831261,LastTimestamp:2026-03-07 06:51:48.418465796 +0000 UTC m=+5.370831261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.239184 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189a7c83c23a83fd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.664886269 +0000 UTC m=+5.617251744,LastTimestamp:2026-03-07 06:51:48.664886269 +0000 UTC m=+5.617251744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.242875 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83c343f23f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.682281535 +0000 UTC m=+5.634647010,LastTimestamp:2026-03-07 06:51:48.682281535 +0000 UTC m=+5.634647010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.247866 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83c35574c7 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.683429063 +0000 UTC m=+5.635794548,LastTimestamp:2026-03-07 06:51:48.683429063 +0000 UTC m=+5.635794548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.249618 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83d1bc24ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.925039818 +0000 UTC m=+5.877405323,LastTimestamp:2026-03-07 06:51:48.925039818 +0000 UTC m=+5.877405323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.253428 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189a7c83d295aed4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.939296468 +0000 UTC m=+5.891661943,LastTimestamp:2026-03-07 06:51:48.939296468 +0000 UTC m=+5.891661943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.257074 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83d2b0ef68 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:48.941082472 +0000 UTC m=+5.893447937,LastTimestamp:2026-03-07 06:51:48.941082472 +0000 UTC m=+5.893447937,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.261505 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83dccb8f9b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:49.110599579 +0000 UTC m=+6.062965044,LastTimestamp:2026-03-07 06:51:49.110599579 +0000 UTC m=+6.062965044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.265840 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7c83dd8bc0d9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:49.123195097 +0000 UTC m=+6.075560562,LastTimestamp:2026-03-07 06:51:49.123195097 +0000 UTC m=+6.075560562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.272155 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 06:52:09 crc 
kubenswrapper[4941]: &Event{ObjectMeta:{kube-controller-manager-crc.189a7c84b82c23c8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 07 06:52:09 crc kubenswrapper[4941]: body: Mar 07 06:52:09 crc kubenswrapper[4941]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:52.791139272 +0000 UTC m=+9.743504737,LastTimestamp:2026-03-07 06:51:52.791139272 +0000 UTC m=+9.743504737,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 06:52:09 crc kubenswrapper[4941]: > Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.275924 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c84b82d711c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:52.791224604 +0000 UTC 
m=+9.743590069,LastTimestamp:2026-03-07 06:51:52.791224604 +0000 UTC m=+9.743590069,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.279991 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 07 06:52:09 crc kubenswrapper[4941]: &Event{ObjectMeta:{kube-apiserver-crc.189a7c86213e8fbb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 07 06:52:09 crc kubenswrapper[4941]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 06:52:09 crc kubenswrapper[4941]: Mar 07 06:52:09 crc kubenswrapper[4941]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:58.848921531 +0000 UTC m=+15.801286996,LastTimestamp:2026-03-07 06:51:58.848921531 +0000 UTC m=+15.801286996,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 06:52:09 crc kubenswrapper[4941]: > Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.283570 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c86213f56cc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:58.848972492 +0000 UTC m=+15.801337967,LastTimestamp:2026-03-07 06:51:58.848972492 +0000 UTC m=+15.801337967,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.290072 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a7c86213e8fbb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 07 06:52:09 crc kubenswrapper[4941]: &Event{ObjectMeta:{kube-apiserver-crc.189a7c86213e8fbb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 07 06:52:09 crc kubenswrapper[4941]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 06:52:09 crc kubenswrapper[4941]: Mar 07 06:52:09 crc kubenswrapper[4941]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:58.848921531 +0000 UTC 
m=+15.801286996,LastTimestamp:2026-03-07 06:51:58.855137753 +0000 UTC m=+15.807503218,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 06:52:09 crc kubenswrapper[4941]: > Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.293875 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a7c86213f56cc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c86213f56cc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:58.848972492 +0000 UTC m=+15.801337967,LastTimestamp:2026-03-07 06:51:58.855182024 +0000 UTC m=+15.807547489,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.298835 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a7c83584fec94\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c83584fec94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:46.887904404 +0000 UTC m=+3.840269869,LastTimestamp:2026-03-07 06:51:59.061016262 +0000 UTC m=+16.013381727,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.304298 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a7c836455280b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c836455280b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:47.089573899 +0000 UTC m=+4.041939364,LastTimestamp:2026-03-07 06:51:59.264973834 +0000 UTC m=+16.217339299,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.310096 4941 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189a7c8364e9d69e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7c8364e9d69e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:47.099317918 +0000 UTC m=+4.051683383,LastTimestamp:2026-03-07 06:51:59.27258642 +0000 UTC m=+16.224951875,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.315234 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 06:52:09 crc kubenswrapper[4941]: &Event{ObjectMeta:{kube-controller-manager-crc.189a7c870c43cacb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 07 06:52:09 crc kubenswrapper[4941]: body: Mar 07 06:52:09 crc kubenswrapper[4941]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:52:02.791910091 +0000 UTC m=+19.744275556,LastTimestamp:2026-03-07 06:52:02.791910091 +0000 UTC m=+19.744275556,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 06:52:09 crc kubenswrapper[4941]: > Mar 07 06:52:09 crc kubenswrapper[4941]: E0307 06:52:09.318456 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c870c45165d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:52:02.791994973 +0000 UTC m=+19.744360468,LastTimestamp:2026-03-07 06:52:02.791994973 +0000 UTC m=+19.744360468,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:09 crc kubenswrapper[4941]: I0307 06:52:09.903024 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:10 crc kubenswrapper[4941]: W0307 06:52:10.344673 4941 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:10 crc kubenswrapper[4941]: E0307 06:52:10.344781 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 07 06:52:10 crc kubenswrapper[4941]: I0307 06:52:10.902520 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:11 crc kubenswrapper[4941]: W0307 06:52:11.169982 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 07 06:52:11 crc kubenswrapper[4941]: E0307 06:52:11.170031 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 07 06:52:11 crc kubenswrapper[4941]: W0307 06:52:11.694041 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 07 06:52:11 crc kubenswrapper[4941]: E0307 06:52:11.694153 4941 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 07 06:52:11 crc kubenswrapper[4941]: I0307 06:52:11.904880 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.245063 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.246722 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.246805 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.246816 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.246853 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:52:12 crc kubenswrapper[4941]: E0307 06:52:12.246993 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 06:52:12 crc kubenswrapper[4941]: E0307 06:52:12.252551 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the 
cluster scope" node="crc" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.792136 4941 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.792280 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.792359 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.792636 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.793919 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.793964 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.793973 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.794457 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" 
containerStatusID={"Type":"cri-o","ID":"0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.794632 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990" gracePeriod=30 Mar 07 06:52:12 crc kubenswrapper[4941]: E0307 06:52:12.797502 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7c870c43cacb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 06:52:12 crc kubenswrapper[4941]: &Event{ObjectMeta:{kube-controller-manager-crc.189a7c870c43cacb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 07 06:52:12 crc kubenswrapper[4941]: body: Mar 07 06:52:12 crc kubenswrapper[4941]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:52:02.791910091 +0000 UTC m=+19.744275556,LastTimestamp:2026-03-07 06:52:12.792241028 +0000 UTC m=+29.744606513,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 06:52:12 crc kubenswrapper[4941]: > Mar 07 06:52:12 crc kubenswrapper[4941]: E0307 06:52:12.803357 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7c870c45165d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c870c45165d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:52:02.791994973 +0000 UTC m=+19.744360468,LastTimestamp:2026-03-07 06:52:12.792309599 +0000 UTC m=+29.744675084,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:12 crc kubenswrapper[4941]: E0307 06:52:12.808562 4941 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c896078f70e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:52:12.794615566 +0000 UTC m=+29.746981031,LastTimestamp:2026-03-07 06:52:12.794615566 +0000 UTC m=+29.746981031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:12 crc kubenswrapper[4941]: I0307 06:52:12.902681 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:13 crc kubenswrapper[4941]: E0307 06:52:13.567895 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7c82eef5301f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c82eef5301f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.120350239 +0000 
UTC m=+2.072715724,LastTimestamp:2026-03-07 06:52:13.562604339 +0000 UTC m=+30.514969804,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:13 crc kubenswrapper[4941]: I0307 06:52:13.906118 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:14 crc kubenswrapper[4941]: E0307 06:52:14.054811 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 06:52:14 crc kubenswrapper[4941]: E0307 06:52:14.087540 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7c8301c91c39\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c8301c91c39 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.436228665 +0000 UTC m=+2.388594160,LastTimestamp:2026-03-07 06:52:14.082296123 +0000 UTC m=+31.034661598,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:14 crc kubenswrapper[4941]: I0307 06:52:14.118563 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 06:52:14 crc kubenswrapper[4941]: I0307 06:52:14.119031 4941 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990" exitCode=255 Mar 07 06:52:14 crc kubenswrapper[4941]: I0307 06:52:14.119078 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990"} Mar 07 06:52:14 crc kubenswrapper[4941]: I0307 06:52:14.119109 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be"} Mar 07 06:52:14 crc kubenswrapper[4941]: E0307 06:52:14.172769 4941 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7c830261efd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7c830261efd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:51:45.446244311 +0000 UTC 
m=+2.398609786,LastTimestamp:2026-03-07 06:52:14.166770355 +0000 UTC m=+31.119135820,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:52:14 crc kubenswrapper[4941]: I0307 06:52:14.904954 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.121539 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.122893 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.122944 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.122954 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.902816 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.954579 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.955796 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.955833 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.955842 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:15 crc kubenswrapper[4941]: I0307 06:52:15.956515 4941 scope.go:117] "RemoveContainer" containerID="2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77" Mar 07 06:52:16 crc kubenswrapper[4941]: I0307 06:52:16.904097 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.000473 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.000668 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.001961 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.002034 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.002051 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.128356 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.132241 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19"} Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.132514 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.133582 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.133646 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.133664 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:17 crc kubenswrapper[4941]: I0307 06:52:17.903158 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.139854 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.140879 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.144183 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19" exitCode=255 Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.144265 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19"} Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.144320 4941 scope.go:117] "RemoveContainer" containerID="2ce424eecb5c066248e6164e7911b9e44aef15465773004c03e4eb539f2cef77" Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.144653 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.145893 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.145946 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.145962 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.146695 4941 scope.go:117] "RemoveContainer" containerID="f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19" Mar 07 06:52:18 crc kubenswrapper[4941]: E0307 06:52:18.146908 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:18 crc kubenswrapper[4941]: I0307 06:52:18.903260 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.149435 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.253328 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.255194 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.255258 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.255274 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.255310 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:52:19 crc kubenswrapper[4941]: E0307 06:52:19.257767 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 06:52:19 crc kubenswrapper[4941]: E0307 06:52:19.257972 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.791044 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.791294 4941 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.793172 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.793252 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.793269 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.795672 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:52:19 crc kubenswrapper[4941]: I0307 06:52:19.902810 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:20 crc kubenswrapper[4941]: I0307 06:52:20.154928 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:20 crc kubenswrapper[4941]: I0307 06:52:20.156049 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:20 crc kubenswrapper[4941]: I0307 06:52:20.156098 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:20 crc kubenswrapper[4941]: I0307 06:52:20.156111 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:20 crc kubenswrapper[4941]: I0307 06:52:20.902887 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:21 crc kubenswrapper[4941]: W0307 06:52:21.755888 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 07 06:52:21 crc kubenswrapper[4941]: E0307 06:52:21.755995 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 07 06:52:21 crc kubenswrapper[4941]: I0307 06:52:21.901872 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:22 crc kubenswrapper[4941]: I0307 06:52:22.903177 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:23 crc kubenswrapper[4941]: W0307 06:52:23.366547 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:23 crc kubenswrapper[4941]: E0307 06:52:23.366622 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 07 
06:52:23 crc kubenswrapper[4941]: I0307 06:52:23.720542 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:52:23 crc kubenswrapper[4941]: I0307 06:52:23.720757 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:23 crc kubenswrapper[4941]: I0307 06:52:23.723314 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:23 crc kubenswrapper[4941]: I0307 06:52:23.723373 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:23 crc kubenswrapper[4941]: I0307 06:52:23.723392 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:23 crc kubenswrapper[4941]: I0307 06:52:23.724133 4941 scope.go:117] "RemoveContainer" containerID="f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19" Mar 07 06:52:23 crc kubenswrapper[4941]: E0307 06:52:23.724397 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:23 crc kubenswrapper[4941]: I0307 06:52:23.907082 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:24 crc kubenswrapper[4941]: E0307 06:52:24.055062 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" 
Mar 07 06:52:24 crc kubenswrapper[4941]: I0307 06:52:24.252767 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:52:24 crc kubenswrapper[4941]: I0307 06:52:24.253059 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:24 crc kubenswrapper[4941]: I0307 06:52:24.254529 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:24 crc kubenswrapper[4941]: I0307 06:52:24.254577 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:24 crc kubenswrapper[4941]: I0307 06:52:24.254592 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:24 crc kubenswrapper[4941]: I0307 06:52:24.255385 4941 scope.go:117] "RemoveContainer" containerID="f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19" Mar 07 06:52:24 crc kubenswrapper[4941]: E0307 06:52:24.255628 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:24 crc kubenswrapper[4941]: I0307 06:52:24.905485 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:25 crc kubenswrapper[4941]: I0307 06:52:25.905218 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:26 crc kubenswrapper[4941]: I0307 06:52:26.258485 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:26 crc kubenswrapper[4941]: I0307 06:52:26.259972 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:26 crc kubenswrapper[4941]: I0307 06:52:26.260046 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:26 crc kubenswrapper[4941]: I0307 06:52:26.260068 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:26 crc kubenswrapper[4941]: I0307 06:52:26.260111 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:52:26 crc kubenswrapper[4941]: E0307 06:52:26.265579 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 06:52:26 crc kubenswrapper[4941]: E0307 06:52:26.266033 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 06:52:26 crc kubenswrapper[4941]: I0307 06:52:26.905202 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:27 crc kubenswrapper[4941]: I0307 06:52:27.003889 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:52:27 crc kubenswrapper[4941]: I0307 06:52:27.004098 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:27 crc kubenswrapper[4941]: I0307 06:52:27.005568 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:27 crc kubenswrapper[4941]: I0307 06:52:27.005621 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:27 crc kubenswrapper[4941]: I0307 06:52:27.005637 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:27 crc kubenswrapper[4941]: I0307 06:52:27.904709 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:28 crc kubenswrapper[4941]: I0307 06:52:28.902577 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:29 crc kubenswrapper[4941]: I0307 06:52:29.903593 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:30 crc kubenswrapper[4941]: I0307 06:52:30.902812 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:31 crc 
kubenswrapper[4941]: W0307 06:52:31.510239 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 07 06:52:31 crc kubenswrapper[4941]: E0307 06:52:31.510329 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 07 06:52:31 crc kubenswrapper[4941]: I0307 06:52:31.902861 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:32 crc kubenswrapper[4941]: I0307 06:52:32.903931 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.265695 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.267372 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.267429 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.267441 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.267463 4941 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Mar 07 06:52:33 crc kubenswrapper[4941]: E0307 06:52:33.269053 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 06:52:33 crc kubenswrapper[4941]: E0307 06:52:33.270627 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.465615 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.465776 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.466894 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.466929 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.466939 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:33 crc kubenswrapper[4941]: I0307 06:52:33.904486 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:34 crc kubenswrapper[4941]: E0307 06:52:34.055185 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not 
found" Mar 07 06:52:34 crc kubenswrapper[4941]: W0307 06:52:34.357330 4941 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 07 06:52:34 crc kubenswrapper[4941]: E0307 06:52:34.357473 4941 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 07 06:52:34 crc kubenswrapper[4941]: I0307 06:52:34.904248 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:35 crc kubenswrapper[4941]: I0307 06:52:35.902602 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:35 crc kubenswrapper[4941]: I0307 06:52:35.954607 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:35 crc kubenswrapper[4941]: I0307 06:52:35.955868 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:35 crc kubenswrapper[4941]: I0307 06:52:35.955903 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:35 crc kubenswrapper[4941]: I0307 06:52:35.955916 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 07 06:52:35 crc kubenswrapper[4941]: I0307 06:52:35.956541 4941 scope.go:117] "RemoveContainer" containerID="f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19" Mar 07 06:52:35 crc kubenswrapper[4941]: E0307 06:52:35.956752 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:36 crc kubenswrapper[4941]: I0307 06:52:36.901953 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:37 crc kubenswrapper[4941]: I0307 06:52:37.903590 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:38 crc kubenswrapper[4941]: I0307 06:52:38.906495 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:39 crc kubenswrapper[4941]: I0307 06:52:39.903468 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:40 crc kubenswrapper[4941]: I0307 06:52:40.270025 4941 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 07 06:52:40 crc kubenswrapper[4941]: I0307 06:52:40.271625 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:40 crc kubenswrapper[4941]: I0307 06:52:40.271720 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:40 crc kubenswrapper[4941]: I0307 06:52:40.271747 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:40 crc kubenswrapper[4941]: I0307 06:52:40.271794 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:52:40 crc kubenswrapper[4941]: E0307 06:52:40.276169 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 06:52:40 crc kubenswrapper[4941]: E0307 06:52:40.276659 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 06:52:40 crc kubenswrapper[4941]: I0307 06:52:40.903316 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:41 crc kubenswrapper[4941]: I0307 06:52:41.905557 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:42 crc kubenswrapper[4941]: I0307 06:52:42.904114 4941 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:43 crc kubenswrapper[4941]: I0307 06:52:43.904201 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:44 crc kubenswrapper[4941]: E0307 06:52:44.056149 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 06:52:44 crc kubenswrapper[4941]: I0307 06:52:44.904663 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:45 crc kubenswrapper[4941]: I0307 06:52:45.906808 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:46 crc kubenswrapper[4941]: I0307 06:52:46.904496 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:47 crc kubenswrapper[4941]: I0307 06:52:47.276796 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:47 crc kubenswrapper[4941]: I0307 06:52:47.278951 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:47 crc 
kubenswrapper[4941]: I0307 06:52:47.279011 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:47 crc kubenswrapper[4941]: I0307 06:52:47.279028 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:47 crc kubenswrapper[4941]: I0307 06:52:47.279062 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:52:47 crc kubenswrapper[4941]: E0307 06:52:47.281050 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 06:52:47 crc kubenswrapper[4941]: E0307 06:52:47.281272 4941 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 07 06:52:47 crc kubenswrapper[4941]: I0307 06:52:47.903357 4941 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 06:52:48 crc kubenswrapper[4941]: I0307 06:52:48.384218 4941 csr.go:261] certificate signing request csr-ckh2s is approved, waiting to be issued Mar 07 06:52:48 crc kubenswrapper[4941]: I0307 06:52:48.393510 4941 csr.go:257] certificate signing request csr-ckh2s is issued Mar 07 06:52:48 crc kubenswrapper[4941]: I0307 06:52:48.438514 4941 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 07 06:52:48 crc kubenswrapper[4941]: I0307 06:52:48.755231 4941 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 07 06:52:48 crc 
kubenswrapper[4941]: I0307 06:52:48.954084 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:48 crc kubenswrapper[4941]: I0307 06:52:48.956099 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:48 crc kubenswrapper[4941]: I0307 06:52:48.956163 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:48 crc kubenswrapper[4941]: I0307 06:52:48.956173 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:48 crc kubenswrapper[4941]: I0307 06:52:48.957133 4941 scope.go:117] "RemoveContainer" containerID="f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19" Mar 07 06:52:49 crc kubenswrapper[4941]: I0307 06:52:49.237802 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 06:52:49 crc kubenswrapper[4941]: I0307 06:52:49.239616 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd"} Mar 07 06:52:49 crc kubenswrapper[4941]: I0307 06:52:49.239784 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:49 crc kubenswrapper[4941]: I0307 06:52:49.240852 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:49 crc kubenswrapper[4941]: I0307 06:52:49.240877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:49 crc kubenswrapper[4941]: I0307 06:52:49.240886 4941 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:49 crc kubenswrapper[4941]: I0307 06:52:49.394835 4941 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-25 10:55:52.232872421 +0000 UTC Mar 07 06:52:49 crc kubenswrapper[4941]: I0307 06:52:49.394900 4941 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7036h3m2.837978038s for next certificate rotation Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.244514 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.245460 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.248131 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" exitCode=255 Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.248186 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd"} Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.248237 4941 scope.go:117] "RemoveContainer" containerID="f0f49605f28fd5785b836233ac0989fd87000c771ed5e7b8cea31121dd162d19" Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.248453 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:50 crc kubenswrapper[4941]: 
I0307 06:52:50.249674 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.249707 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.249719 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:50 crc kubenswrapper[4941]: I0307 06:52:50.250477 4941 scope.go:117] "RemoveContainer" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" Mar 07 06:52:50 crc kubenswrapper[4941]: E0307 06:52:50.250691 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:51 crc kubenswrapper[4941]: I0307 06:52:51.253263 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.574977 4941 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.721111 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.721289 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.722506 4941 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.722539 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.722547 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.723062 4941 scope.go:117] "RemoveContainer" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" Mar 07 06:52:53 crc kubenswrapper[4941]: E0307 06:52:53.723295 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.954193 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.955464 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.955504 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:53 crc kubenswrapper[4941]: I0307 06:52:53.955515 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.057054 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.252117 4941 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.264249 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.265297 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.265342 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.265353 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.266053 4941 scope.go:117] "RemoveContainer" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.266225 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.282166 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.283738 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.283778 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:54 crc kubenswrapper[4941]: 
I0307 06:52:54.283793 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.283928 4941 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.290712 4941 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.291107 4941 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.291131 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.294300 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.294341 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.294351 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.294366 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.294375 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:52:54Z","lastTransitionTime":"2026-03-07T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.318975 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.327858 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.328197 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.328260 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.328331 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.328391 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:52:54Z","lastTransitionTime":"2026-03-07T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.349661 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.354449 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.354531 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.354552 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.354576 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.354595 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:52:54Z","lastTransitionTime":"2026-03-07T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.369292 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.373150 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.373186 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.373196 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.373214 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 07 06:52:54 crc kubenswrapper[4941]: I0307 06:52:54.373224 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:52:54Z","lastTransitionTime":"2026-03-07T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.383567 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.383684 4941 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.383717 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.484058 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.584169 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.684263 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.784450 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.884767 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:54 crc kubenswrapper[4941]: E0307 06:52:54.985567 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.086058 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.187195 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.287596 4941 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.388471 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.489654 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.589785 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.690597 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.792009 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.892876 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:55 crc kubenswrapper[4941]: E0307 06:52:55.993009 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc kubenswrapper[4941]: E0307 06:52:56.093613 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc kubenswrapper[4941]: I0307 06:52:56.173584 4941 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 06:52:56 crc kubenswrapper[4941]: E0307 06:52:56.194451 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc kubenswrapper[4941]: E0307 06:52:56.294809 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc 
kubenswrapper[4941]: E0307 06:52:56.395685 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc kubenswrapper[4941]: E0307 06:52:56.496647 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc kubenswrapper[4941]: E0307 06:52:56.597115 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc kubenswrapper[4941]: E0307 06:52:56.697663 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc kubenswrapper[4941]: E0307 06:52:56.798752 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:56 crc kubenswrapper[4941]: E0307 06:52:56.899963 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.000108 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.100562 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.201965 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.302788 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.404374 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.505284 4941 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.606358 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.708882 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.809600 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:57 crc kubenswrapper[4941]: E0307 06:52:57.910221 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.011231 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.111869 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.212953 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.314165 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.415255 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.515661 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.616130 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.716504 4941 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.817356 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:58 crc kubenswrapper[4941]: E0307 06:52:58.917583 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.018462 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.118929 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.219394 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.320098 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.421685 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.522768 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.623132 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.724021 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 06:52:59.824880 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:52:59 crc kubenswrapper[4941]: E0307 
06:52:59.925563 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.026353 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.127788 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.228967 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.329852 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.430546 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.530724 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.631782 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.732513 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.833079 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:00 crc kubenswrapper[4941]: E0307 06:53:00.933999 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.034573 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 
06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.134961 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.235518 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.336051 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.436817 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.537354 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.637844 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.738366 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.838598 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:01 crc kubenswrapper[4941]: E0307 06:53:01.938838 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.039623 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.140094 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.240351 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.340732 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.440929 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.541114 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.641535 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.742567 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.843640 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:02 crc kubenswrapper[4941]: E0307 06:53:02.944449 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.045601 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.146419 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: I0307 06:53:03.172025 4941 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.246968 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.347496 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.447710 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.548820 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.649106 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.749526 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.850554 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:03 crc kubenswrapper[4941]: E0307 06:53:03.951388 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.052280 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.057534 4941 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.152647 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.253773 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.354336 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.455367 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.493792 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.499362 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.499454 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.499481 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.499513 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.499539 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:04Z","lastTransitionTime":"2026-03-07T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.513225 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.517313 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.517348 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.517359 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.517375 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.517388 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:04Z","lastTransitionTime":"2026-03-07T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.529705 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.534442 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.534482 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.534494 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.534509 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.534519 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:04Z","lastTransitionTime":"2026-03-07T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.546980 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.551803 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.551860 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.551877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.551906 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:04 crc kubenswrapper[4941]: I0307 06:53:04.551924 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:04Z","lastTransitionTime":"2026-03-07T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.564620 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.564783 4941 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.564817 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.665616 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.766393 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.867165 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:04 crc kubenswrapper[4941]: E0307 06:53:04.967800 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.068283 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.168978 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.269285 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.369889 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.470436 4941 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.571540 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.672451 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.773074 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.873360 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:05 crc kubenswrapper[4941]: E0307 06:53:05.974505 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.075682 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.176486 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.276698 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.377152 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.478155 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.578837 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc 
kubenswrapper[4941]: E0307 06:53:06.679299 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.779547 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.880177 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:06 crc kubenswrapper[4941]: I0307 06:53:06.953741 4941 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:53:06 crc kubenswrapper[4941]: I0307 06:53:06.955263 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:06 crc kubenswrapper[4941]: I0307 06:53:06.955312 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:06 crc kubenswrapper[4941]: I0307 06:53:06.955331 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:06 crc kubenswrapper[4941]: I0307 06:53:06.956367 4941 scope.go:117] "RemoveContainer" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.956686 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:53:06 crc kubenswrapper[4941]: E0307 06:53:06.980716 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 
06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.081196 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.181665 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.282125 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.382831 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.483625 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.584292 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.684540 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.785187 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.886437 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:07 crc kubenswrapper[4941]: E0307 06:53:07.986926 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.087377 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.188267 4941 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.289448 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.389921 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.490108 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.590288 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.691443 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.792497 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.892872 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:08 crc kubenswrapper[4941]: E0307 06:53:08.994781 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:09 crc kubenswrapper[4941]: E0307 06:53:09.095153 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:09 crc kubenswrapper[4941]: E0307 06:53:09.195670 4941 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.286749 4941 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.297745 4941 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.297796 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.297814 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.297836 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.297853 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:09Z","lastTransitionTime":"2026-03-07T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.400630 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.400685 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.400697 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.400718 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.400730 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:09Z","lastTransitionTime":"2026-03-07T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.507518 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.507588 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.507609 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.507638 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.507667 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:09Z","lastTransitionTime":"2026-03-07T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.611511 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.611562 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.611579 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.611603 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.611621 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:09Z","lastTransitionTime":"2026-03-07T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.714877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.714931 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.714947 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.714966 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.714979 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:09Z","lastTransitionTime":"2026-03-07T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.818570 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.818637 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.818657 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.818682 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.818701 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:09Z","lastTransitionTime":"2026-03-07T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.921295 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.921377 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.921439 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.921475 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.921499 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:09Z","lastTransitionTime":"2026-03-07T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.928507 4941 apiserver.go:52] "Watching apiserver" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.936293 4941 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.936934 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj","openshift-multus/multus-kc9rw","openshift-multus/network-metrics-daemon-q9fpr","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-image-registry/node-ca-vm6ql","openshift-machine-config-operator/machine-config-daemon-knkqz","openshift-multus/multus-additional-cni-plugins-q9xqh","openshift-ovn-kubernetes/ovnkube-node-x5ztp","openshift-dns/node-resolver-lv4jp","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.937432 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.937450 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:09 crc kubenswrapper[4941]: E0307 06:53:09.937515 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.937992 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:09 crc kubenswrapper[4941]: E0307 06:53:09.938045 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.938168 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.938267 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.938304 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:09 crc kubenswrapper[4941]: E0307 06:53:09.938363 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.938457 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.939034 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.939050 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.939080 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.939095 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lv4jp" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.939160 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kc9rw" Mar 07 06:53:09 crc kubenswrapper[4941]: E0307 06:53:09.939241 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.939278 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.939671 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.940735 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.940815 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.941484 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.946626 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.946783 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.947056 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.947222 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.947302 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.947428 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 07 06:53:09 crc kubenswrapper[4941]: 
I0307 06:53:09.947743 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.947822 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.947890 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.947893 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.947900 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.948272 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.948537 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.948667 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.948916 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.949246 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.949451 4941 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.949555 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.949757 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.949812 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.949926 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.949971 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.950205 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.950319 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.950479 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.950516 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.951152 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.951384 4941 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"proxy-tls" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.951579 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.951602 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.951776 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.951683 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.951727 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.952290 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.971092 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.983553 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:09 crc kubenswrapper[4941]: I0307 06:53:09.996193 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.007518 4941 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.010261 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.019325 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.023795 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.023840 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.023853 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.023873 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.023887 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.034133 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.045769 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.056480 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064181 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064237 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064265 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064287 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064308 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064327 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064348 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064369 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064393 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064431 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064450 4941 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064472 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064488 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064506 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064520 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064537 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064554 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064574 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064590 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064605 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064622 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064638 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064654 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064675 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064693 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064711 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064729 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.064745 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064677 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064773 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064830 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064761 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064941 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065072 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065112 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065146 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065185 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065216 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065264 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065314 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065367 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065465 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.065516 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065566 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065619 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065669 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065713 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065762 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.064854 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065095 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067692 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065103 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065308 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065379 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065507 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.065712 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066169 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066363 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066388 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067970 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068039 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068091 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068149 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068209 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068259 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068300 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068335 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068390 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068477 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068534 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 06:53:10 crc kubenswrapper[4941]: 
I0307 06:53:10.068588 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068640 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068769 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068887 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068942 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068997 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069046 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066463 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066500 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066514 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066623 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066753 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066844 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.066897 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067385 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067488 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067553 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067578 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067594 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067603 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067710 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.067793 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068025 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068669 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.068913 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069159 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069254 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069466 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069519 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069731 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069758 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069785 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069809 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069999 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070031 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070059 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070175 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070202 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070222 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070251 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070273 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070293 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070312 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070369 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070389 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070428 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070451 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070481 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" 
(UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070509 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070530 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070553 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070576 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070597 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 
06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070633 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070668 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070698 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070722 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070817 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070833 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070856 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070874 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070904 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070932 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070967 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.071002 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071031 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071053 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071073 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071098 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071116 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071132 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071150 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071168 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071188 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071206 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 
06:53:10.071227 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071253 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071286 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071312 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071338 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071360 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071379 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071418 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071453 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071475 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071499 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071522 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071546 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071572 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071634 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071661 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071689 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071717 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071772 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071827 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071858 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071887 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071915 4941 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071944 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072014 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.069842 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072471 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.072507 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:53:10.572058756 +0000 UTC m=+87.524424421 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072650 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072695 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072765 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072819 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072839 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072860 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072970 4941 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073001 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073361 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073176 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070429 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070712 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071099 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073648 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071281 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071391 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071597 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071680 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071817 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.071970 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.072154 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073241 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.070249 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073327 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073385 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073843 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073869 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073896 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073928 4941 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073946 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073951 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073969 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.073979 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074117 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074169 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074207 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074248 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074293 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074304 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074373 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074389 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074439 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074467 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074514 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074551 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074584 4941 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074618 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074652 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074686 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074724 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074758 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074794 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074828 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074864 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074902 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074938 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074971 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075008 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075040 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075078 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075126 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075176 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075226 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075277 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075330 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075382 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075487 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 
07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075541 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075576 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075610 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075647 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075680 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075714 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075752 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075790 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075833 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075867 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075902 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.075941 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075983 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076033 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076081 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076128 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076177 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076316 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-cnibin\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076383 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-hostroot\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076478 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076736 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076785 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-script-lib\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076844 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076895 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3dd327c3-0368-47cd-87cb-d972354bedee-hosts-file\") pod \"node-resolver-lv4jp\" (UID: \"3dd327c3-0368-47cd-87cb-d972354bedee\") " pod="openshift-dns/node-resolver-lv4jp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076948 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-cnibin\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076993 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077040 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxsbb\" (UniqueName: \"kubernetes.io/projected/ed82bc0c-1609-449c-b2e2-2fe04af9749d-kube-api-access-rxsbb\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077084 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077130 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077166 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077207 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-k8s-cni-cncf-io\") pod \"multus-kc9rw\" (UID: 
\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077240 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-kubelet\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077274 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-systemd\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077308 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/250d2c0d-993b-466a-a5e0-bacae5fe8df5-rootfs\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077340 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-slash\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077379 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077457 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-system-cni-dir\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077511 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qs9j\" (UniqueName: \"kubernetes.io/projected/bf634026-9cb8-4afa-ad8c-e4f119f04899-kube-api-access-6qs9j\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077546 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-system-cni-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077582 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-netns\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077620 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-cni-multus\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077869 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077994 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078036 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078077 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-etc-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078112 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078147 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078187 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078220 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3469f59-621c-4493-ade3-768772d05ebd-ovn-node-metrics-cert\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078258 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.078298 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078380 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/250d2c0d-993b-466a-a5e0-bacae5fe8df5-mcd-auth-proxy-config\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078484 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs886\" (UniqueName: \"kubernetes.io/projected/250d2c0d-993b-466a-a5e0-bacae5fe8df5-kube-api-access-gs886\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078584 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-ovn\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078628 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078667 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf634026-9cb8-4afa-ad8c-e4f119f04899-cni-binary-copy\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078706 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed82bc0c-1609-449c-b2e2-2fe04af9749d-cni-binary-copy\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078737 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-log-socket\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078770 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-bin\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078804 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh96x\" (UniqueName: 
\"kubernetes.io/projected/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-kube-api-access-dh96x\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078836 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-conf-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078868 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-env-overrides\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078902 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-cni-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078935 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-daemon-config\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078973 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-host\") pod \"node-ca-vm6ql\" (UID: \"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079012 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-kubelet\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079057 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-netns\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079098 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-netd\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079146 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf634026-9cb8-4afa-ad8c-e4f119f04899-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079195 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/250d2c0d-993b-466a-a5e0-bacae5fe8df5-proxy-tls\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079244 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sn24\" (UniqueName: \"kubernetes.io/projected/80030e60-caa3-4aad-8b00-10f5143d9243-kube-api-access-9sn24\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079289 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-os-release\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079342 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-socket-dir-parent\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079438 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.079492 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-cni-bin\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079536 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-etc-kubernetes\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079606 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079645 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2jb\" (UniqueName: \"kubernetes.io/projected/3dd327c3-0368-47cd-87cb-d972354bedee-kube-api-access-2x2jb\") pod \"node-resolver-lv4jp\" (UID: \"3dd327c3-0368-47cd-87cb-d972354bedee\") " pod="openshift-dns/node-resolver-lv4jp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079680 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcp6v\" (UniqueName: \"kubernetes.io/projected/c3469f59-621c-4493-ade3-768772d05ebd-kube-api-access-xcp6v\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079722 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079777 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-os-release\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079826 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-multus-certs\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079874 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-var-lib-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079920 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-node-log\") pod 
\"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079962 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-serviceca\") pod \"node-ca-vm6ql\" (UID: \"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080005 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4x72\" (UniqueName: \"kubernetes.io/projected/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-kube-api-access-h4x72\") pod \"node-ca-vm6ql\" (UID: \"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080047 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-systemd-units\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080091 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-config\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080144 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-env-overrides\") pod 
\"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080190 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080347 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080383 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080445 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080482 4941 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080515 4941 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.080548 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080583 4941 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080612 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080643 4941 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080674 4941 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080705 4941 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080735 4941 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080767 4941 
reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080798 4941 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080827 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080857 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080887 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080918 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080951 4941 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080983 4941 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081014 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081044 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081076 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081104 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081132 4941 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081161 4941 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081191 4941 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 
07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081229 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081262 4941 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081292 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081321 4941 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081351 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081380 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081456 4941 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081488 4941 
reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081514 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081543 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081571 4941 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081600 4941 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081634 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081663 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081694 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081723 4941 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081750 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081777 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081807 4941 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081837 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081870 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081900 4941 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 
06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081930 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081959 4941 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081992 4941 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082024 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082053 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082081 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082111 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082139 4941 reconciler_common.go:293] "Volume detached for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082166 4941 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082197 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082231 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.083734 4941 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.083655 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.084453 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074468 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074481 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.085532 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074519 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074588 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074730 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.074765 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075079 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075082 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075084 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075109 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075252 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075555 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075865 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.075875 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076028 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.076132 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077538 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078055 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078100 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078112 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078260 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078296 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078571 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078593 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078839 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.078937 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079250 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079297 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079306 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079679 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079767 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079769 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079844 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079868 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079913 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.079833 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080121 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.077714 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080421 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.080936 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081803 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081887 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081877 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081803 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081923 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081928 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.081990 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082091 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082110 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082117 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082171 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082184 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082198 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082224 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082261 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.082740 4941 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.082785 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.083071 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.083357 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.083627 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.083697 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.083933 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.084160 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.084360 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.084343 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.084688 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.084817 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.085120 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.085247 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.086251 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.087070 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.087137 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.087291 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:10.587269233 +0000 UTC m=+87.539634698 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.087319 4941 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.088034 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.088172 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.088733 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.088929 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.089188 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.089246 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.090548 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.091219 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.091824 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:10.591778657 +0000 UTC m=+87.544144222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.091963 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.091992 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.092412 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.092497 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.092818 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.092873 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.092903 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.093119 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.096092 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.097350 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.097671 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.097762 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.099920 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.100177 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.100753 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.101206 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.101693 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.101974 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.101994 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.102008 4941 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.102062 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:10.602045858 +0000 UTC m=+87.554411323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.102279 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.104768 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.104794 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.104804 4941 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.104847 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-07 06:53:10.604836849 +0000 UTC m=+87.557202314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.105316 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.105510 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.105676 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.105876 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.105908 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.106194 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.106350 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.106825 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.107293 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.109921 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.111860 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.111994 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.112036 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.112118 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.112203 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.112277 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.112308 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.112416 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.112551 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.112623 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.113189 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.113278 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.113394 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.113578 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.113822 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.113860 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.114048 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.114060 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.114302 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.114325 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.114822 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.115210 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.115628 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.117941 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.121252 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.121282 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.121772 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.121859 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.122084 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.122375 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.128129 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.128177 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.128186 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.128206 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.128218 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.130210 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.131725 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.141061 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.141281 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.150978 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.153458 4941 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.154991 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.162808 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.176560 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.182945 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.182982 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-etc-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.182999 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183023 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3469f59-621c-4493-ade3-768772d05ebd-ovn-node-metrics-cert\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183038 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 
06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183055 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/250d2c0d-993b-466a-a5e0-bacae5fe8df5-mcd-auth-proxy-config\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183070 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs886\" (UniqueName: \"kubernetes.io/projected/250d2c0d-993b-466a-a5e0-bacae5fe8df5-kube-api-access-gs886\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183086 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-ovn\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.183093 4941 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183129 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-etc-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.183162 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs 
podName:80030e60-caa3-4aad-8b00-10f5143d9243 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:10.68314452 +0000 UTC m=+87.635509985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs") pod "network-metrics-daemon-q9fpr" (UID: "80030e60-caa3-4aad-8b00-10f5143d9243") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183100 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183251 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183250 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf634026-9cb8-4afa-ad8c-e4f119f04899-cni-binary-copy\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183285 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed82bc0c-1609-449c-b2e2-2fe04af9749d-cni-binary-copy\") pod \"multus-kc9rw\" (UID: 
\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183308 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-log-socket\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183331 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-bin\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183354 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-conf-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183376 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh96x\" (UniqueName: \"kubernetes.io/projected/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-kube-api-access-dh96x\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183397 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183431 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-cni-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183459 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-daemon-config\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183479 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-env-overrides\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183555 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-kubelet\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183582 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-host\") pod \"node-ca-vm6ql\" (UID: \"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 
06:53:10.183619 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-conf-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183637 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf634026-9cb8-4afa-ad8c-e4f119f04899-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183661 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/250d2c0d-993b-466a-a5e0-bacae5fe8df5-proxy-tls\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183708 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-netns\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183729 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-netd\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183790 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9sn24\" (UniqueName: \"kubernetes.io/projected/80030e60-caa3-4aad-8b00-10f5143d9243-kube-api-access-9sn24\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183813 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-os-release\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183289 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183835 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-socket-dir-parent\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183889 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-etc-kubernetes\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.183968 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x2jb\" (UniqueName: 
\"kubernetes.io/projected/3dd327c3-0368-47cd-87cb-d972354bedee-kube-api-access-2x2jb\") pod \"node-resolver-lv4jp\" (UID: \"3dd327c3-0368-47cd-87cb-d972354bedee\") " pod="openshift-dns/node-resolver-lv4jp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184002 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-cni-bin\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184087 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-ovn\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184069 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-os-release\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184209 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/250d2c0d-993b-466a-a5e0-bacae5fe8df5-mcd-auth-proxy-config\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184306 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-log-socket\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184351 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-kubelet\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184400 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-bin\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184345 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-host\") pod \"node-ca-vm6ql\" (UID: \"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184368 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf634026-9cb8-4afa-ad8c-e4f119f04899-cni-binary-copy\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184467 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-etc-kubernetes\") pod \"multus-kc9rw\" (UID: 
\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184567 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-cni-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184632 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-netd\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184667 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed82bc0c-1609-449c-b2e2-2fe04af9749d-cni-binary-copy\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184684 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-netns\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184762 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-cni-bin\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184777 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-os-release\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184799 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-os-release\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184852 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-env-overrides\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184903 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-multus-certs\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184939 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-multus-certs\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.184987 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-var-lib-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185024 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-node-log\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185056 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcp6v\" (UniqueName: \"kubernetes.io/projected/c3469f59-621c-4493-ade3-768772d05ebd-kube-api-access-xcp6v\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185106 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-daemon-config\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185141 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-var-lib-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185129 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4x72\" (UniqueName: 
\"kubernetes.io/projected/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-kube-api-access-h4x72\") pod \"node-ca-vm6ql\" (UID: \"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185202 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-node-log\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185319 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-systemd-units\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185456 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-config\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185496 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-env-overrides\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185525 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-serviceca\") pod \"node-ca-vm6ql\" (UID: 
\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185550 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185578 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-cnibin\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185603 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-hostroot\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185629 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185658 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-cnibin\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 
06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185680 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185707 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxsbb\" (UniqueName: \"kubernetes.io/projected/ed82bc0c-1609-449c-b2e2-2fe04af9749d-kube-api-access-rxsbb\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185733 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185759 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-script-lib\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185795 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3dd327c3-0368-47cd-87cb-d972354bedee-hosts-file\") pod \"node-resolver-lv4jp\" (UID: \"3dd327c3-0368-47cd-87cb-d972354bedee\") " pod="openshift-dns/node-resolver-lv4jp" Mar 07 06:53:10 crc 
kubenswrapper[4941]: I0307 06:53:10.185819 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185863 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-k8s-cni-cncf-io\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185890 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-kubelet\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185913 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-systemd\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185939 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qs9j\" (UniqueName: \"kubernetes.io/projected/bf634026-9cb8-4afa-ad8c-e4f119f04899-kube-api-access-6qs9j\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" 
Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185964 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-system-cni-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185987 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-netns\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186012 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-cni-multus\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186028 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-env-overrides\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186077 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/250d2c0d-993b-466a-a5e0-bacae5fe8df5-rootfs\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186034 4941 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/250d2c0d-993b-466a-a5e0-bacae5fe8df5-rootfs\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186123 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-slash\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186160 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-system-cni-dir\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186183 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3dd327c3-0368-47cd-87cb-d972354bedee-hosts-file\") pod \"node-resolver-lv4jp\" (UID: \"3dd327c3-0368-47cd-87cb-d972354bedee\") " pod="openshift-dns/node-resolver-lv4jp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186258 4941 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186272 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 
07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186282 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186291 4941 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186300 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186309 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186321 4941 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186330 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186331 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-cnibin\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 
07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186308 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-slash\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186340 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186371 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186379 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-systemd\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186388 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-system-cni-dir\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186421 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 
crc kubenswrapper[4941]: I0307 06:53:10.186437 4941 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186669 4941 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186680 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186691 4941 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186757 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-serviceca\") pod \"node-ca-vm6ql\" (UID: \"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186832 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-cnibin\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186744 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 07 
06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186873 4941 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186888 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186900 4941 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186914 4941 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186927 4941 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186941 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186955 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.186966 4941 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187086 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-hostroot\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187124 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-openvswitch\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.185530 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-systemd-units\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187279 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-multus-socket-dir-parent\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187326 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5ztp\" 
(UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187347 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-k8s-cni-cncf-io\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187423 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-run-netns\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187478 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-host-var-lib-cni-multus\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187708 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-kubelet\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187490 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed82bc0c-1609-449c-b2e2-2fe04af9749d-system-cni-dir\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.187487 
4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.188702 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.188803 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf634026-9cb8-4afa-ad8c-e4f119f04899-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.188858 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.188885 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.189372 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-script-lib\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.190698 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.191813 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/250d2c0d-993b-466a-a5e0-bacae5fe8df5-proxy-tls\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192165 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-config\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192275 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192300 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192315 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192327 4941 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192339 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192352 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192365 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192378 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192389 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192425 4941 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192437 4941 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192448 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192461 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192474 4941 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192491 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192506 4941 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192518 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192528 4941 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node 
\"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192540 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192551 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192563 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192573 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192585 4941 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192596 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192608 4941 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 
06:53:10.192619 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192629 4941 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192641 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192655 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192669 4941 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192683 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192695 4941 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192705 4941 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192716 4941 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192729 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192741 4941 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192776 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192789 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192800 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192811 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") 
on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192820 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192831 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192840 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192850 4941 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192860 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192870 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192880 4941 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192893 
4941 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192904 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192915 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192926 4941 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192936 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192946 4941 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192956 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192967 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192979 4941 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.192988 4941 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193001 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193011 4941 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193021 4941 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193030 4941 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193040 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on 
node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193051 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193060 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193069 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193079 4941 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193088 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193098 4941 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193106 4941 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193115 4941 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193127 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193137 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193147 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193159 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193172 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193185 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193196 4941 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193205 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193215 4941 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193226 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193239 4941 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193249 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193260 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193271 4941 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on 
node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193280 4941 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193289 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193298 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193309 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193319 4941 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193329 4941 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193338 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193347 4941 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193358 4941 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193367 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193380 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193392 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193417 4941 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193427 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193438 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193448 4941 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193459 4941 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193469 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193482 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193492 4941 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193503 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.193514 4941 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.195747 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf634026-9cb8-4afa-ad8c-e4f119f04899-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.198060 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.201573 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3469f59-621c-4493-ade3-768772d05ebd-ovn-node-metrics-cert\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.207387 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh96x\" (UniqueName: \"kubernetes.io/projected/1cf1fc89-d66c-4cd5-b2ea-9537627bdf39-kube-api-access-dh96x\") pod \"ovnkube-control-plane-749d76644c-w48fj\" (UID: \"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.207385 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs886\" (UniqueName: 
\"kubernetes.io/projected/250d2c0d-993b-466a-a5e0-bacae5fe8df5-kube-api-access-gs886\") pod \"machine-config-daemon-knkqz\" (UID: \"250d2c0d-993b-466a-a5e0-bacae5fe8df5\") " pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.207530 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sn24\" (UniqueName: \"kubernetes.io/projected/80030e60-caa3-4aad-8b00-10f5143d9243-kube-api-access-9sn24\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.207923 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxsbb\" (UniqueName: \"kubernetes.io/projected/ed82bc0c-1609-449c-b2e2-2fe04af9749d-kube-api-access-rxsbb\") pod \"multus-kc9rw\" (UID: \"ed82bc0c-1609-449c-b2e2-2fe04af9749d\") " pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.207983 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcp6v\" (UniqueName: \"kubernetes.io/projected/c3469f59-621c-4493-ade3-768772d05ebd-kube-api-access-xcp6v\") pod \"ovnkube-node-x5ztp\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.208834 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4x72\" (UniqueName: \"kubernetes.io/projected/5b8177e9-cc72-474e-95fa-b9d3539f4ad7-kube-api-access-h4x72\") pod \"node-ca-vm6ql\" (UID: \"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\") " pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.209883 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x2jb\" (UniqueName: 
\"kubernetes.io/projected/3dd327c3-0368-47cd-87cb-d972354bedee-kube-api-access-2x2jb\") pod \"node-resolver-lv4jp\" (UID: \"3dd327c3-0368-47cd-87cb-d972354bedee\") " pod="openshift-dns/node-resolver-lv4jp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.221663 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qs9j\" (UniqueName: \"kubernetes.io/projected/bf634026-9cb8-4afa-ad8c-e4f119f04899-kube-api-access-6qs9j\") pod \"multus-additional-cni-plugins-q9xqh\" (UID: \"bf634026-9cb8-4afa-ad8c-e4f119f04899\") " pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.231296 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.231333 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.231342 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.231358 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.231368 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.257267 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.265611 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.274174 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 06:53:10 crc kubenswrapper[4941]: W0307 06:53:10.278036 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-216bd116477968d356de6d7938c6c6e0ce08038071b214516aa7cdcef731d5e7 WatchSource:0}: Error finding container 216bd116477968d356de6d7938c6c6e0ce08038071b214516aa7cdcef731d5e7: Status 404 returned error can't find the container with id 216bd116477968d356de6d7938c6c6e0ce08038071b214516aa7cdcef731d5e7 Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.284070 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vm6ql" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.291001 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" Mar 07 06:53:10 crc kubenswrapper[4941]: W0307 06:53:10.296959 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-9a1269627f89c98993edb590af7b24f603b9d78d21b57a36b7cd258762f55fd1 WatchSource:0}: Error finding container 9a1269627f89c98993edb590af7b24f603b9d78d21b57a36b7cd258762f55fd1: Status 404 returned error can't find the container with id 9a1269627f89c98993edb590af7b24f603b9d78d21b57a36b7cd258762f55fd1 Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.300051 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.312911 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.319389 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9a1269627f89c98993edb590af7b24f603b9d78d21b57a36b7cd258762f55fd1"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.320796 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"216bd116477968d356de6d7938c6c6e0ce08038071b214516aa7cdcef731d5e7"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.321128 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.322056 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"78a132889dbe0b74a25b5e6f4dad72e4a740cc4ea562fa75473a2b7790e27667"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.330179 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kc9rw" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.334895 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.334929 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.334939 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.334957 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.334971 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.336659 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lv4jp" Mar 07 06:53:10 crc kubenswrapper[4941]: W0307 06:53:10.338649 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3469f59_621c_4493_ade3_768772d05ebd.slice/crio-5e9b053798d2692856ad88ba678cdc393c3bd826be5094ccd0ebad12f91e397e WatchSource:0}: Error finding container 5e9b053798d2692856ad88ba678cdc393c3bd826be5094ccd0ebad12f91e397e: Status 404 returned error can't find the container with id 5e9b053798d2692856ad88ba678cdc393c3bd826be5094ccd0ebad12f91e397e Mar 07 06:53:10 crc kubenswrapper[4941]: W0307 06:53:10.355780 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250d2c0d_993b_466a_a5e0_bacae5fe8df5.slice/crio-faedc53bdf00d219528f8798c2a10bca627abb068925da85096e3eda2cdef353 WatchSource:0}: Error finding container faedc53bdf00d219528f8798c2a10bca627abb068925da85096e3eda2cdef353: Status 404 returned error can't find the container with id faedc53bdf00d219528f8798c2a10bca627abb068925da85096e3eda2cdef353 Mar 07 06:53:10 crc kubenswrapper[4941]: W0307 06:53:10.357909 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cf1fc89_d66c_4cd5_b2ea_9537627bdf39.slice/crio-3e8f451b6cfd726f760d8af2f2baa23b75e9a4ed52b37dff58a7ebf8986aee0d WatchSource:0}: Error finding container 3e8f451b6cfd726f760d8af2f2baa23b75e9a4ed52b37dff58a7ebf8986aee0d: Status 404 returned error can't find the container with id 3e8f451b6cfd726f760d8af2f2baa23b75e9a4ed52b37dff58a7ebf8986aee0d Mar 07 06:53:10 crc kubenswrapper[4941]: W0307 06:53:10.374805 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd327c3_0368_47cd_87cb_d972354bedee.slice/crio-b141c3ef26fe41d96baf35ffa4eac32490d1948f92498749d5e96140eb3a4e0b 
WatchSource:0}: Error finding container b141c3ef26fe41d96baf35ffa4eac32490d1948f92498749d5e96140eb3a4e0b: Status 404 returned error can't find the container with id b141c3ef26fe41d96baf35ffa4eac32490d1948f92498749d5e96140eb3a4e0b Mar 07 06:53:10 crc kubenswrapper[4941]: W0307 06:53:10.389753 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded82bc0c_1609_449c_b2e2_2fe04af9749d.slice/crio-7fd67870d0bd2698a52616e1fc25c64a5b62b29cecb237e4060f173353a63299 WatchSource:0}: Error finding container 7fd67870d0bd2698a52616e1fc25c64a5b62b29cecb237e4060f173353a63299: Status 404 returned error can't find the container with id 7fd67870d0bd2698a52616e1fc25c64a5b62b29cecb237e4060f173353a63299 Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.437994 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.438049 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.438067 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.438089 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.438104 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.541126 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.541161 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.541172 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.541188 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.541201 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.596671 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.596785 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.596838 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.596943 4941 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.596982 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:11.596967974 +0000 UTC m=+88.549333439 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.597230 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:53:11.59722282 +0000 UTC m=+88.549588285 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.597366 4941 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.597477 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:11.597453796 +0000 UTC m=+88.549819291 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.643102 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.643153 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.643164 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.643181 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.643191 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.697731 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.697783 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.697846 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.697969 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.697986 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.697997 4941 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.698041 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:11.698028763 +0000 UTC m=+88.650394228 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.698092 4941 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.698127 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.698166 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.698137 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs podName:80030e60-caa3-4aad-8b00-10f5143d9243 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:11.698127886 +0000 UTC m=+88.650493341 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs") pod "network-metrics-daemon-q9fpr" (UID: "80030e60-caa3-4aad-8b00-10f5143d9243") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.698177 4941 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.698241 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:11.698219938 +0000 UTC m=+88.650585443 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.748345 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.748386 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.748398 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.748433 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.748445 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.851134 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.851183 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.851197 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.851214 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.851225 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.953252 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.953967 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.953987 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.954005 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.954017 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:10Z","lastTransitionTime":"2026-03-07T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:10 crc kubenswrapper[4941]: I0307 06:53:10.954214 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:10 crc kubenswrapper[4941]: E0307 06:53:10.954344 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.056371 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.056434 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.056444 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.056459 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.056471 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.160301 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.160341 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.160352 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.160369 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.160381 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.263482 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.263540 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.263558 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.263582 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.263599 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.326598 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.328759 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kc9rw" event={"ID":"ed82bc0c-1609-449c-b2e2-2fe04af9749d","Type":"ContainerStarted","Data":"9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.328784 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kc9rw" event={"ID":"ed82bc0c-1609-449c-b2e2-2fe04af9749d","Type":"ContainerStarted","Data":"7fd67870d0bd2698a52616e1fc25c64a5b62b29cecb237e4060f173353a63299"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.330739 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.330800 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.330814 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"faedc53bdf00d219528f8798c2a10bca627abb068925da85096e3eda2cdef353"} 
Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.331828 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vm6ql" event={"ID":"5b8177e9-cc72-474e-95fa-b9d3539f4ad7","Type":"ContainerStarted","Data":"64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.331862 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vm6ql" event={"ID":"5b8177e9-cc72-474e-95fa-b9d3539f4ad7","Type":"ContainerStarted","Data":"8ee9c684cdbcb5bb1217116f91cc3603c8ebbd4dc802ba0e3890235085bac6c7"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.333685 4941 generic.go:334] "Generic (PLEG): container finished" podID="bf634026-9cb8-4afa-ad8c-e4f119f04899" containerID="86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955" exitCode=0 Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.333757 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerDied","Data":"86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.333779 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerStarted","Data":"21bf313f5e7fd58fd8a6ee889c870b832fc9d9187380b69e6dad94522a0965e2"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.337304 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lv4jp" event={"ID":"3dd327c3-0368-47cd-87cb-d972354bedee","Type":"ContainerStarted","Data":"29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.337343 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-lv4jp" event={"ID":"3dd327c3-0368-47cd-87cb-d972354bedee","Type":"ContainerStarted","Data":"b141c3ef26fe41d96baf35ffa4eac32490d1948f92498749d5e96140eb3a4e0b"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.339395 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.339440 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.344709 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df" exitCode=0 Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.344832 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.345971 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"5e9b053798d2692856ad88ba678cdc393c3bd826be5094ccd0ebad12f91e397e"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.347466 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.349100 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" event={"ID":"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39","Type":"ContainerStarted","Data":"2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.349128 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" 
event={"ID":"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39","Type":"ContainerStarted","Data":"fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.349158 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" event={"ID":"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39","Type":"ContainerStarted","Data":"3e8f451b6cfd726f760d8af2f2baa23b75e9a4ed52b37dff58a7ebf8986aee0d"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.369137 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.369221 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.369233 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.369253 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.369267 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.369158 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.382886 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.396368 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.411880 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.432331 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.444543 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.456318 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.470894 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.472944 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.473018 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.473032 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.473051 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.473090 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.484328 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.497863 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.515886 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.531437 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc 
kubenswrapper[4941]: I0307 06:53:11.553814 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.569577 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.582348 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.582396 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.582423 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.582441 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.582451 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.587692 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.599353 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.607322 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.607487 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:53:13.607457169 +0000 UTC m=+90.559822634 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.607599 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.607670 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.607762 4941 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.607787 4941 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.607825 4941 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:13.607808958 +0000 UTC m=+90.560174423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.607849 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:13.607840099 +0000 UTC m=+90.560205564 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.615004 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.627597 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.643476 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.656228 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.670529 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.686112 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.686166 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 
06:53:11.686178 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.686203 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.686219 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.695002 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.708542 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.708606 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.708635 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.708801 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.708854 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.708851 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.708896 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.708910 4941 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.709008 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-07 06:53:13.708985401 +0000 UTC m=+90.661351026 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.708874 4941 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.709445 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:13.709436612 +0000 UTC m=+90.661802077 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.708814 4941 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.709490 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs podName:80030e60-caa3-4aad-8b00-10f5143d9243 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:13.709482354 +0000 UTC m=+90.661847819 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs") pod "network-metrics-daemon-q9fpr" (UID: "80030e60-caa3-4aad-8b00-10f5143d9243") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.710129 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.728091 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.743375 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.761919 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.780770 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:11Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:11 crc 
kubenswrapper[4941]: I0307 06:53:11.788610 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.788648 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.788657 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.788673 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.788685 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.891905 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.892326 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.892345 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.892363 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.892389 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.954473 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.954498 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.954473 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.954676 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.954599 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:11 crc kubenswrapper[4941]: E0307 06:53:11.954842 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.961109 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.962054 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.963504 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.964924 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.966143 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.966694 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.967375 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.969148 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.970416 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.975664 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.976455 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.977757 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.978357 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.978967 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.979950 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.980548 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.981630 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.982092 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.982715 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.983935 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.984680 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.986101 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.986603 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.987877 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.988774 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.989542 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.990679 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.991194 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.993271 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.994003 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.994575 4941 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.995108 4941 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.996543 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.996596 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.996610 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.996630 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.996644 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:11Z","lastTransitionTime":"2026-03-07T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.996908 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.997565 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 07 06:53:11 crc kubenswrapper[4941]: I0307 06:53:11.998734 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.000209 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.000886 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.001852 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.002532 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.003588 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.004131 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.005166 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.005942 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.006923 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.007457 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.008431 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.009024 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.010310 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" 
path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.010828 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.011780 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.012247 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.012803 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.013785 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.014246 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.099848 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.099899 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.099909 4941 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.099927 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.099938 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.203292 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.203364 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.203376 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.203396 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.203448 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.306367 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.306442 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.306462 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.306483 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.306496 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.357043 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.357294 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.357320 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.357332 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.359137 4941 generic.go:334] "Generic (PLEG): container finished" podID="bf634026-9cb8-4afa-ad8c-e4f119f04899" containerID="0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159" exitCode=0 Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.359242 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerDied","Data":"0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.383878 4941 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.400278 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.409780 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.409837 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.409851 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.409873 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.409885 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.415438 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.432527 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"
/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.449238 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.463549 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.477029 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.497386 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.512743 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.512792 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.512807 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.512826 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.512846 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.512837 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.529048 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.546196 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.563700 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.578607 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.592306 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:12Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:12 crc 
kubenswrapper[4941]: I0307 06:53:12.617899 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.617952 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.617965 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.617986 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.618003 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.720275 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.720336 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.720350 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.720374 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.720388 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.823819 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.823865 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.823877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.823891 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.823904 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.926091 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.926148 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.926166 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.926185 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.926197 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:12Z","lastTransitionTime":"2026-03-07T06:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:12 crc kubenswrapper[4941]: I0307 06:53:12.954616 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:12 crc kubenswrapper[4941]: E0307 06:53:12.954810 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.029394 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.029466 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.029521 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.029554 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.029568 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.138346 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.138398 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.138464 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.138491 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.138503 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.242881 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.243269 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.244017 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.244061 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.244093 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.347031 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.347082 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.347108 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.347132 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.347147 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.365964 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.366262 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.368039 4941 generic.go:334] "Generic (PLEG): container finished" podID="bf634026-9cb8-4afa-ad8c-e4f119f04899" containerID="5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562" exitCode=0 Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.368100 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerDied","Data":"5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.369270 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.381972 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.400903 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.416388 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.428461 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.440951 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.450220 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.450252 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.450262 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.450278 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.450291 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.455854 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z 
is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.471080 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.485861 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.501182 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc 
kubenswrapper[4941]: I0307 06:53:13.514555 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.529949 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.541359 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.553032 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.553078 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.553087 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.553108 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.553119 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.559912 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 
06:53:13.571087 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.586225 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.599739 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.619050 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.630790 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.630882 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.630926 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:53:17.630906442 +0000 UTC m=+94.583271907 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.630952 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.630990 4941 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.631026 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:17.631017905 +0000 UTC m=+94.583383370 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.631039 4941 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.631062 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:17.631056215 +0000 UTC m=+94.583421680 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.632070 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.642916 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.655223 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.656164 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc 
kubenswrapper[4941]: I0307 06:53:13.656294 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.656391 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.656509 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.656598 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.669053 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.685954 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.700828 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc 
kubenswrapper[4941]: I0307 06:53:13.714974 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.728942 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.731598 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.731641 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.731716 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.731890 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.731893 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.731943 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.731957 4941 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.731913 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.732087 4941 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.732019 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:17.732000582 +0000 UTC m=+94.684366047 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.732029 4941 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.732156 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs podName:80030e60-caa3-4aad-8b00-10f5143d9243 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:17.732146456 +0000 UTC m=+94.684511921 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs") pod "network-metrics-daemon-q9fpr" (UID: "80030e60-caa3-4aad-8b00-10f5143d9243") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.732191 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:17.732164577 +0000 UTC m=+94.684530052 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.740307 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.759242 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 
06:53:13.759472 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.759506 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.759517 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.759533 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.759543 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.772298 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.862296 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.862349 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.862364 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.862383 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.862396 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.953994 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.954098 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.954184 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.954190 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.954273 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:13 crc kubenswrapper[4941]: E0307 06:53:13.954513 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.964635 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.964710 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.964724 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.964744 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.964769 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:13Z","lastTransitionTime":"2026-03-07T06:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.977389 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:13 crc kubenswrapper[4941]: I0307 06:53:13.994989 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.015987 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.044042 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.066145 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.066827 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.066991 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.067239 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.067264 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.067279 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.087524 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.100103 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.112940 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.123258 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc 
kubenswrapper[4941]: I0307 06:53:14.135499 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.151754 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 
06:53:14.163175 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.169590 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.169637 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.169649 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.169685 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.169697 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.177481 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.189486 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.272565 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.272628 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.272642 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.272662 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.272680 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.374706 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.374760 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.374775 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.374793 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.374806 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.377004 4941 generic.go:334] "Generic (PLEG): container finished" podID="bf634026-9cb8-4afa-ad8c-e4f119f04899" containerID="83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a" exitCode=0 Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.377096 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerDied","Data":"83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.395023 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.415693 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.436378 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.450299 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.462329 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.477825 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.477892 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.477908 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.477929 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.477944 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.479519 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z 
is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.495134 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.508954 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.522508 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc 
kubenswrapper[4941]: I0307 06:53:14.536947 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.550097 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.561509 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.577044 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.581010 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.581049 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.581060 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.581077 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.581093 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.591079 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.683646 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.683683 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.683692 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.683708 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.683717 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.786509 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.786552 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.786563 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.786584 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.786599 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.889792 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.889877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.889893 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.889915 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.889933 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.919839 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.919933 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.919953 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.920011 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.920032 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: E0307 06:53:14.937336 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.941751 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.941825 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.941846 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.941874 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.941893 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.953757 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:14 crc kubenswrapper[4941]: E0307 06:53:14.953924 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:14 crc kubenswrapper[4941]: E0307 06:53:14.960807 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.964808 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.964846 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.964857 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.964875 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.964887 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.986881 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.986940 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.986951 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.986971 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:14 crc kubenswrapper[4941]: I0307 06:53:14.986984 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:14Z","lastTransitionTime":"2026-03-07T06:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.005518 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.005546 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.005557 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.005592 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.005823 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: E0307 06:53:15.019640 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: E0307 06:53:15.019864 4941 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.021976 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.022035 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.022050 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.022089 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc 
kubenswrapper[4941]: I0307 06:53:15.022105 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.125131 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.125194 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.125211 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.125235 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.125252 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.228044 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.228089 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.228101 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.228121 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.228132 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.331584 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.331655 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.331674 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.331701 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.331725 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.385066 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.388368 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerStarted","Data":"ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.407690 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.425446 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.434890 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.434945 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.434962 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.434987 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.435005 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.447660 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.459688 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.471120 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.483500 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.496863 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.512619 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.524956 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc 
kubenswrapper[4941]: I0307 06:53:15.537739 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.537797 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.537809 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.537848 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.537851 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.537889 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.549923 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.561293 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.575870 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs
9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.588240 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:15Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.640511 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.640566 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.640580 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.640599 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 
06:53:15.640612 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.742871 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.742922 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.742935 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.742956 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.742968 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.845199 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.845247 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.845259 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.845278 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.845291 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.948012 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.948080 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.948106 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.948140 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.948162 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:15Z","lastTransitionTime":"2026-03-07T06:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.954711 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.954765 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:15 crc kubenswrapper[4941]: I0307 06:53:15.954721 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:15 crc kubenswrapper[4941]: E0307 06:53:15.954879 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:15 crc kubenswrapper[4941]: E0307 06:53:15.954984 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:15 crc kubenswrapper[4941]: E0307 06:53:15.955075 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.050964 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.051011 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.051022 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.051038 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.051049 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.154204 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.154270 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.154280 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.154334 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.154349 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.257225 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.257281 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.257292 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.257313 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.257323 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.361277 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.361369 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.361393 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.361470 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.361495 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.397100 4941 generic.go:334] "Generic (PLEG): container finished" podID="bf634026-9cb8-4afa-ad8c-e4f119f04899" containerID="ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305" exitCode=0 Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.397181 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerDied","Data":"ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.415093 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.439942 4941 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc 
kubenswrapper[4941]: I0307 06:53:16.458547 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.463862 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.463925 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.463944 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.463969 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.463988 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.482904 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.499585 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.516187 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.536022 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.567993 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.568060 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.568083 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.568112 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.568134 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.569620 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.584342 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.595493 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.608327 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.622746 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.635360 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.657236 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:16Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.670975 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.671036 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.671051 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.671074 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.671093 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.774393 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.775086 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.775098 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.775128 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.775145 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.877092 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.877144 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.877157 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.877174 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.877218 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.954238 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:16 crc kubenswrapper[4941]: E0307 06:53:16.954422 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.980565 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.980628 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.980640 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.980662 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:16 crc kubenswrapper[4941]: I0307 06:53:16.980673 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:16Z","lastTransitionTime":"2026-03-07T06:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.083945 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.084015 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.084042 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.084072 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.084114 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.187337 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.187374 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.187382 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.187397 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.187426 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.289542 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.289599 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.289613 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.289634 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.289649 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.393306 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.393366 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.393379 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.393419 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.393430 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.405330 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.405666 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.405719 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.412671 4941 generic.go:334] "Generic (PLEG): container finished" podID="bf634026-9cb8-4afa-ad8c-e4f119f04899" containerID="c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e" exitCode=0 Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.412733 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerDied","Data":"c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.425594 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.439840 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.447005 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.460588 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fc
f7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.471238 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.486707 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.498437 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.498477 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.498486 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.498504 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.498515 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.500012 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.516853 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.528802 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.540922 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc 
kubenswrapper[4941]: I0307 06:53:17.579623 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc 
kubenswrapper[4941]: I0307 06:53:17.601993 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.602035 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.602045 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.602067 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.602080 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.612259 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.626424 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.635720 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc 
kubenswrapper[4941]: I0307 06:53:17.646045 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.661022 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.673017 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.680923 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:17 crc 
kubenswrapper[4941]: E0307 06:53:17.681138 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:53:25.681109172 +0000 UTC m=+102.633474637 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.681200 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.681281 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.681351 4941 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 
06:53:17.681431 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:25.681413659 +0000 UTC m=+102.633779124 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.681450 4941 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.681502 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:25.681491831 +0000 UTC m=+102.633857416 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.686102 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.701970 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.704535 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.704583 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.704594 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.704609 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.704620 4941 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.712378 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.726198 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.738628 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.759827 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.772705 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.781659 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.781710 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.781741 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.781876 4941 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.781903 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.781930 4941 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.781934 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.781972 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.781985 4941 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.781954 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs podName:80030e60-caa3-4aad-8b00-10f5143d9243 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:25.781937226 +0000 UTC m=+102.734302691 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs") pod "network-metrics-daemon-q9fpr" (UID: "80030e60-caa3-4aad-8b00-10f5143d9243") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.782060 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-07 06:53:25.782042458 +0000 UTC m=+102.734407923 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.781950 4941 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.782103 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:25.7820969 +0000 UTC m=+102.734462365 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.783445 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.793863 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.804610 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.806569 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.806607 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.806622 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.806641 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.806655 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.817608 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.829056 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:17Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:17 crc 
kubenswrapper[4941]: I0307 06:53:17.909224 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.909259 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.909267 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.909280 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.909290 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:17Z","lastTransitionTime":"2026-03-07T06:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.953915 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.953963 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:17 crc kubenswrapper[4941]: I0307 06:53:17.954031 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.954067 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.954223 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:17 crc kubenswrapper[4941]: E0307 06:53:17.954317 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.011133 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.011535 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.011548 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.011565 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.011575 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.113934 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.113974 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.113986 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.114005 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.114017 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.217795 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.217860 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.217878 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.217910 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.217930 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.320866 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.320911 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.320923 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.320943 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.320955 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.420788 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" event={"ID":"bf634026-9cb8-4afa-ad8c-e4f119f04899","Type":"ContainerStarted","Data":"cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.421790 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.424336 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.424382 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.424446 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.424463 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.424476 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.436997 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.446158 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.451830 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc 
kubenswrapper[4941]: I0307 06:53:18.466372 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.480984 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.496153 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.510969 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.531516 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.531634 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.531671 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.531766 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.531798 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.531811 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.548812 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.561877 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.574900 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc 
kubenswrapper[4941]: I0307 06:53:18.589288 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc 
kubenswrapper[4941]: I0307 06:53:18.603002 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.615484 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.634596 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.634641 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.634653 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.634673 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.634688 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.638160 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.652240 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.661349 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.679331 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.692534 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.713456 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.730103 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.738014 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.738054 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.738065 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.738097 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.738112 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.743655 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.758085 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.777194 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.790106 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.800428 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc 
kubenswrapper[4941]: I0307 06:53:18.813810 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.826666 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.841359 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.841429 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.841449 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.841472 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.841487 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.843153 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:18Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.944279 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.944328 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.944341 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.944358 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.944370 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:18Z","lastTransitionTime":"2026-03-07T06:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:18 crc kubenswrapper[4941]: I0307 06:53:18.954553 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:18 crc kubenswrapper[4941]: E0307 06:53:18.954690 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.046849 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.046894 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.046906 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.046992 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.047004 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.149836 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.149870 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.149879 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.149894 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.149903 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.251801 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.251840 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.251849 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.251867 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.251876 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.354312 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.354362 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.354376 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.354415 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.354430 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.457231 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.457276 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.457288 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.457305 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.457317 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.560206 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.560248 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.560261 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.560281 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.560292 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.663247 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.663293 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.663304 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.663322 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.663335 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.766057 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.766116 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.766131 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.766162 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.766175 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.869330 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.869640 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.869701 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.869803 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.869860 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.954627 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.954641 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:19 crc kubenswrapper[4941]: E0307 06:53:19.954763 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:19 crc kubenswrapper[4941]: E0307 06:53:19.955122 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.955654 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:19 crc kubenswrapper[4941]: E0307 06:53:19.955955 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.972372 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.972647 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.972712 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.972802 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.972864 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:19Z","lastTransitionTime":"2026-03-07T06:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.992524 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 06:53:19 crc kubenswrapper[4941]: I0307 06:53:19.993241 4941 scope.go:117] "RemoveContainer" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" Mar 07 06:53:19 crc kubenswrapper[4941]: E0307 06:53:19.993489 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.075647 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.075686 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.075699 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.075716 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.075728 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.178465 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.178508 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.178517 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.178534 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.178545 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.281398 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.281474 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.281483 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.281501 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.281512 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.384825 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.384891 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.384907 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.384929 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.384943 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.430905 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/0.log" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.435523 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c" exitCode=1 Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.435604 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.436660 4941 scope.go:117] "RemoveContainer" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.436697 4941 scope.go:117] "RemoveContainer" containerID="b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c" Mar 07 06:53:20 crc kubenswrapper[4941]: E0307 06:53:20.436822 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.464988 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.480698 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.488446 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.488638 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.488748 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.488848 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.488935 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.494363 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc 
kubenswrapper[4941]: I0307 06:53:20.509164 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.526171 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.548818 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc4
7fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.561017 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.578420 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.592043 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.592080 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.592116 4941 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.592138 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.592150 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.593987 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c
2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.618888 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:19.807443 6759 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 06:53:19.808393 6759 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 06:53:19.808422 6759 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:19.808444 6759 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:19.808449 6759 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:19.808464 6759 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 06:53:19.808500 6759 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 06:53:19.808521 6759 factory.go:656] Stopping watch factory\\\\nI0307 06:53:19.808540 6759 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0307 06:53:19.808551 6759 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:19.808557 6759 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 06:53:19.808564 6759 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:19.808570 6759 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:19.808576 6759 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.634924 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.647624 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.662273 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.675952 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.693792 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:20Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.695783 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.695821 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.695832 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.695852 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.695863 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.798717 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.798770 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.798786 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.798807 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.798823 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.901163 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.901219 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.901231 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.901250 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.901263 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:20Z","lastTransitionTime":"2026-03-07T06:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.953788 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:20 crc kubenswrapper[4941]: E0307 06:53:20.953924 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:20 crc kubenswrapper[4941]: I0307 06:53:20.967907 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.003328 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.003363 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.003374 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.003390 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.003417 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.106206 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.106270 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.106287 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.106312 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.106337 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.208521 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.208555 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.208564 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.208577 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.208587 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.310867 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.310906 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.310919 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.310935 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.310946 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.413275 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.413305 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.413313 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.413327 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.413337 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.440734 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/1.log" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.441449 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/0.log" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.443554 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d" exitCode=1 Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.444596 4941 scope.go:117] "RemoveContainer" containerID="6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d" Mar 07 06:53:21 crc kubenswrapper[4941]: E0307 06:53:21.444717 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.444758 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.444782 4941 scope.go:117] "RemoveContainer" containerID="b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.471269 4941 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.483027 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.494573 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.511238 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.516002 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.516033 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.516044 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.516057 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.516067 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.523663 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.533993 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.552548 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"m
ountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.566189 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.578621 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.597766 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d372778a2aa8cbd40091ef08e57fdf4064ffc777ef757cf4397b0cb2e7d04c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:19Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:19.807443 6759 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0307 06:53:19.808393 6759 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 
06:53:19.808422 6759 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:19.808444 6759 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:19.808449 6759 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:19.808464 6759 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 06:53:19.808500 6759 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 06:53:19.808521 6759 factory.go:656] Stopping watch factory\\\\nI0307 06:53:19.808540 6759 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0307 06:53:19.808551 6759 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:19.808557 6759 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 06:53:19.808564 6759 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:19.808570 6759 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:19.808576 6759 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:21Z\\\",\\\"message\\\":\\\"ers/externalversions/factory.go:140\\\\nI0307 06:53:21.260819 6899 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:53:21.260803 6899 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:21.261051 6899 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:21.261079 6899 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0307 06:53:21.260754 6899 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:53:21.261093 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:21.261111 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:21.261174 6899 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 06:53:21.261711 6899 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:21.261905 6899 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 06:53:21.261923 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:21.261944 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:21.261957 6899 factory.go:656] Stopping watch factory\\\\nI0307 06:53:21.261967 6899 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.
d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.612214 4941 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.620086 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.620120 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.620131 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.620151 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.620162 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.625005 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc 
kubenswrapper[4941]: I0307 06:53:21.646996 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.664841 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.679795 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.696597 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:21Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.729576 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.729617 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.729628 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.729646 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.729658 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.833399 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.833866 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.834040 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.834263 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.834299 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.937373 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.937453 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.937468 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.937485 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.937500 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:21Z","lastTransitionTime":"2026-03-07T06:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.953753 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.953813 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:21 crc kubenswrapper[4941]: I0307 06:53:21.953820 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:21 crc kubenswrapper[4941]: E0307 06:53:21.953938 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:21 crc kubenswrapper[4941]: E0307 06:53:21.954179 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:21 crc kubenswrapper[4941]: E0307 06:53:21.954300 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.040540 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.040602 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.040621 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.040651 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.040669 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.143582 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.143632 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.143646 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.143670 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.143686 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.247671 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.247737 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.247760 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.247787 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.247808 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.351653 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.351713 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.351732 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.351758 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.351781 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.449776 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/1.log" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.453742 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.453778 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.453789 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.453813 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.453828 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.455620 4941 scope.go:117] "RemoveContainer" containerID="6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d" Mar 07 06:53:22 crc kubenswrapper[4941]: E0307 06:53:22.456029 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.472878 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.492359 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.507455 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.525020 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc 
kubenswrapper[4941]: I0307 06:53:22.551434 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.556760 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.556808 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.556818 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.556834 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.556844 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.571348 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.590486 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.611452 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.626783 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.638599 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.659724 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.659774 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.659787 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.659809 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.659822 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.689526 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.702747 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.727166 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:21Z\\\",\\\"message\\\":\\\"ers/externalversions/factory.go:140\\\\nI0307 06:53:21.260819 6899 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:53:21.260803 6899 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:21.261051 6899 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:21.261079 6899 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0307 06:53:21.260754 6899 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:53:21.261093 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:21.261111 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:21.261174 6899 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 06:53:21.261711 6899 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:21.261905 6899 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 06:53:21.261923 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:21.261944 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:21.261957 6899 factory.go:656] Stopping watch factory\\\\nI0307 06:53:21.261967 6899 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.741899 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.754467 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.762306 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.762353 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.762367 4941 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.762382 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.762395 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.768669 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:22Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.864581 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.864618 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.864627 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.864641 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.864652 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.953647 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:22 crc kubenswrapper[4941]: E0307 06:53:22.953786 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.967475 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.967519 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.967532 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.967549 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:22 crc kubenswrapper[4941]: I0307 06:53:22.967562 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:22Z","lastTransitionTime":"2026-03-07T06:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.071251 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.071335 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.071357 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.071388 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.071446 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.174482 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.174523 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.174535 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.174555 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.174569 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.277874 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.277928 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.277938 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.277986 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.278000 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.381305 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.381343 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.381353 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.381372 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.381382 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.484887 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.484935 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.484946 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.484992 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.485006 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.588502 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.588563 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.588580 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.588604 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.588621 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.691904 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.691953 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.691964 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.691983 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.691996 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.796279 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.796337 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.796348 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.796373 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.796392 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.898952 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.899030 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.899052 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.899071 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.899086 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:23Z","lastTransitionTime":"2026-03-07T06:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.954013 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.954157 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:23 crc kubenswrapper[4941]: E0307 06:53:23.954174 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:23 crc kubenswrapper[4941]: E0307 06:53:23.954491 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.954678 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:23 crc kubenswrapper[4941]: E0307 06:53:23.954802 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.975639 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:23Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:23 crc kubenswrapper[4941]: I0307 06:53:23.990942 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:23Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.001238 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.001281 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.001295 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.001316 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.001331 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.005292 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.022161 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc4
7fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.034092 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.044792 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.058448 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.072846 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.086977 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.104087 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.104452 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.104557 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.104643 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.104724 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.105535 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:21Z\\\",\\\"message\\\":\\\"ers/externalversions/factory.go:140\\\\nI0307 06:53:21.260819 6899 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:53:21.260803 6899 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:21.261051 6899 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:21.261079 6899 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0307 06:53:21.260754 6899 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:53:21.261093 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:21.261111 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:21.261174 6899 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 06:53:21.261711 6899 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:21.261905 6899 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 06:53:21.261923 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:21.261944 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:21.261957 6899 factory.go:656] Stopping watch factory\\\\nI0307 06:53:21.261967 6899 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.116804 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.128291 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc 
kubenswrapper[4941]: I0307 06:53:24.150842 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.169678 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.185091 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.202185 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:24Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.206944 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.206977 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.206990 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.207008 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.207021 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.310522 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.310570 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.310580 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.310623 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.310634 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.412985 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.413034 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.413045 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.413062 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.413078 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.516654 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.516717 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.516737 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.516761 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.516780 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.619240 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.619289 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.619302 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.619320 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.619333 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.722268 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.722330 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.722350 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.722378 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.722395 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.825735 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.825815 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.825834 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.825862 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.825881 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.929679 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.929742 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.929755 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.929777 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.929790 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:24Z","lastTransitionTime":"2026-03-07T06:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:24 crc kubenswrapper[4941]: I0307 06:53:24.954194 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:24 crc kubenswrapper[4941]: E0307 06:53:24.954342 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.034083 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.034131 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.034149 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.034175 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.034193 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.136959 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.137034 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.137054 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.137083 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.137110 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.217811 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.217899 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.217919 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.217946 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.217968 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.237908 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:25Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.243579 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.243646 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.243659 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.243686 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.243699 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.256307 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:25Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.260376 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.260440 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.260452 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.260469 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.260480 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.274691 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:25Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.280318 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.280360 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.280374 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.280392 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.280428 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.295524 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:25Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.300802 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.300844 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.300857 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.300877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.300891 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.316783 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:25Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.316935 4941 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.318959 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.318992 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.319001 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.319018 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.319029 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.422001 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.422041 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.422053 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.422069 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.422081 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.525309 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.525378 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.525391 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.525474 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.525491 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.629592 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.629638 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.629646 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.629669 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.629678 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.733472 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.733572 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.733597 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.733634 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.733659 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.774588 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.774812 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 06:53:41.774780062 +0000 UTC m=+118.727145567 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.774953 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.775044 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.775147 4941 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.775214 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-07 06:53:41.775199652 +0000 UTC m=+118.727565137 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.775222 4941 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.775391 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:41.775330116 +0000 UTC m=+118.727695591 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.836694 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.837244 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.837805 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.837836 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.837880 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.875808 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.875861 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.875882 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876050 4941 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876108 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs podName:80030e60-caa3-4aad-8b00-10f5143d9243 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:41.876093048 +0000 UTC m=+118.828458513 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs") pod "network-metrics-daemon-q9fpr" (UID: "80030e60-caa3-4aad-8b00-10f5143d9243") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876105 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876155 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876169 4941 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876171 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876219 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876239 4941 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876243 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:41.876223241 +0000 UTC m=+118.828588756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.876311 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 06:53:41.876289403 +0000 UTC m=+118.828654928 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.941570 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.941632 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.941641 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.941658 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.941671 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:25Z","lastTransitionTime":"2026-03-07T06:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.954028 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.954071 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.954145 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.954279 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:25 crc kubenswrapper[4941]: I0307 06:53:25.954314 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:25 crc kubenswrapper[4941]: E0307 06:53:25.954428 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.045616 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.045684 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.045708 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.045738 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.045766 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.148290 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.148328 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.148337 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.148351 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.148363 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.250812 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.250864 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.250875 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.250889 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.250899 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.354222 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.354263 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.354274 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.354290 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.354301 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.457580 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.457628 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.457641 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.457658 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.457669 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.560840 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.560905 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.560918 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.560940 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.560953 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.663688 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.664219 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.664239 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.664256 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.664268 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.767732 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.767810 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.767826 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.767864 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.767896 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.870517 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.870569 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.870580 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.870596 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.870611 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.953898 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:26 crc kubenswrapper[4941]: E0307 06:53:26.954060 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.973067 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.973135 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.973145 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.973159 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:26 crc kubenswrapper[4941]: I0307 06:53:26.973168 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:26Z","lastTransitionTime":"2026-03-07T06:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.076504 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.076542 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.076552 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.076566 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.076576 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.180037 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.180074 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.180083 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.180146 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.180158 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.283638 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.283702 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.283716 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.283733 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.283747 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.386994 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.387038 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.387048 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.387067 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.387077 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.491185 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.491237 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.491246 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.491266 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.491279 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.593682 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.593723 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.593733 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.593767 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.593781 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.696661 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.696714 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.696757 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.696807 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.696821 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.798924 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.799021 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.799055 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.799084 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.799106 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.902369 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.902423 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.902432 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.902446 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.902455 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:27Z","lastTransitionTime":"2026-03-07T06:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.953700 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:27 crc kubenswrapper[4941]: E0307 06:53:27.953842 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.954192 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:27 crc kubenswrapper[4941]: E0307 06:53:27.954248 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:27 crc kubenswrapper[4941]: I0307 06:53:27.954325 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:27 crc kubenswrapper[4941]: E0307 06:53:27.954371 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.005186 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.005239 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.005249 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.005270 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.005283 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.107848 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.107904 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.107915 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.107933 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.107945 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.211226 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.211271 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.211291 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.211308 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.211318 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.313963 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.314031 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.314052 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.314087 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.314104 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.417211 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.417270 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.417284 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.417304 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.417321 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.519801 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.519877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.519895 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.519920 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.519948 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.622906 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.622979 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.622992 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.623011 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.623023 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.725896 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.725942 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.725954 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.725973 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.725988 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.828566 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.828613 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.828624 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.828642 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.828658 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.930738 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.930798 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.930813 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.930836 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.930848 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:28Z","lastTransitionTime":"2026-03-07T06:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:28 crc kubenswrapper[4941]: I0307 06:53:28.954473 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:28 crc kubenswrapper[4941]: E0307 06:53:28.954685 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.033932 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.034039 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.034053 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.034078 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.034102 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.137224 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.137260 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.137270 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.137291 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.137303 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.240813 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.240866 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.240877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.240896 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.240909 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.343939 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.344001 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.344018 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.344044 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.344063 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.446389 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.446457 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.446470 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.446492 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.446511 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.549678 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.549728 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.549737 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.549754 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.549766 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.651849 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.651910 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.651925 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.651944 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.651956 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.755392 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.755465 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.755476 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.755493 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.755503 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.858707 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.858775 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.858794 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.858820 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.858841 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.954337 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.954640 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.954706 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:29 crc kubenswrapper[4941]: E0307 06:53:29.954859 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:29 crc kubenswrapper[4941]: E0307 06:53:29.954992 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:29 crc kubenswrapper[4941]: E0307 06:53:29.955183 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.961474 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.961797 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.961878 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.961941 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.962011 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:29Z","lastTransitionTime":"2026-03-07T06:53:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:29 crc kubenswrapper[4941]: I0307 06:53:29.970472 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.608928 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.608984 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.609001 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.609028 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.609047 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:30Z","lastTransitionTime":"2026-03-07T06:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.713864 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.714350 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.714604 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.714806 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.715011 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:30Z","lastTransitionTime":"2026-03-07T06:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.818106 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.818159 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.818170 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.818188 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.818201 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:30Z","lastTransitionTime":"2026-03-07T06:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.921256 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.921318 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.921328 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.921341 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.921351 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:30Z","lastTransitionTime":"2026-03-07T06:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:30 crc kubenswrapper[4941]: I0307 06:53:30.953886 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:30 crc kubenswrapper[4941]: E0307 06:53:30.954053 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.023883 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.023958 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.023971 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.023989 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.024003 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.126903 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.126951 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.126962 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.126988 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.127003 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.229494 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.229540 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.229551 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.229568 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.229578 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.332995 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.333047 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.333058 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.333074 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.333084 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.436220 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.436268 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.436281 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.436299 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.436312 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.537936 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.537994 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.538012 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.538037 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.538054 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.640180 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.640243 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.640265 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.640289 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.640307 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.743218 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.743257 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.743267 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.743283 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.743294 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.846802 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.846866 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.846882 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.846910 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.846927 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.949824 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.949892 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.949911 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.949936 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.949959 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:31Z","lastTransitionTime":"2026-03-07T06:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.954372 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.954516 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:31 crc kubenswrapper[4941]: E0307 06:53:31.954761 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:31 crc kubenswrapper[4941]: E0307 06:53:31.954535 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:31 crc kubenswrapper[4941]: I0307 06:53:31.955053 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:31 crc kubenswrapper[4941]: E0307 06:53:31.955266 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.053305 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.053361 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.053377 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.053422 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.053435 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.156045 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.156103 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.156115 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.156130 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.156142 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.259051 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.259089 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.259098 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.259112 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.259122 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.361818 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.361865 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.361877 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.361901 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.361914 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.465204 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.465261 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.465274 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.465290 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.465307 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.568247 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.568304 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.568321 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.568342 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.568359 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.671589 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.671643 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.671656 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.671675 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.671690 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.774826 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.774866 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.774876 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.774892 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.774902 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.878371 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.878449 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.878462 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.878478 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.878493 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.953622 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:32 crc kubenswrapper[4941]: E0307 06:53:32.953854 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.981018 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.981091 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.981114 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.981141 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:32 crc kubenswrapper[4941]: I0307 06:53:32.981164 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:32Z","lastTransitionTime":"2026-03-07T06:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.084812 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.084882 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.084899 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.084924 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.084939 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.189192 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.189270 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.189290 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.189315 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.189335 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.292568 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.292639 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.292652 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.292672 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.292685 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.395851 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.395903 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.395914 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.395930 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.395942 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.498842 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.498888 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.498902 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.498919 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.498933 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.601479 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.601564 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.601577 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.601596 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.601610 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.705340 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.706068 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.706334 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.706365 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.706520 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.809343 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.809824 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.810029 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.810181 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.810276 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.913459 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.913503 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.913512 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.913527 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.913537 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:33Z","lastTransitionTime":"2026-03-07T06:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.954475 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.954668 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:33 crc kubenswrapper[4941]: E0307 06:53:33.954832 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.954999 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:33 crc kubenswrapper[4941]: E0307 06:53:33.955302 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:33 crc kubenswrapper[4941]: E0307 06:53:33.955434 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.956144 4941 scope.go:117] "RemoveContainer" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.976866 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:33Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:33 crc kubenswrapper[4941]: I0307 06:53:33.995354 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:33Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc 
kubenswrapper[4941]: I0307 06:53:34.016845 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.016880 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.016888 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.016902 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.016913 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.019746 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.036834 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.049238 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.063534 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.073781 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.085222 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.100691 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.112678 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.119390 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.119454 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.119467 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.119493 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.119510 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.132205 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.143991 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.166301 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc 
kubenswrapper[4941]: I0307 06:53:34.183764 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc 
kubenswrapper[4941]: I0307 06:53:34.200268 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.213312 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.221701 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.221756 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.221765 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.221786 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.221798 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.233359 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:21Z\\\",\\\"message\\\":\\\"ers/externalversions/factory.go:140\\\\nI0307 06:53:21.260819 6899 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:53:21.260803 6899 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:21.261051 6899 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:21.261079 6899 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0307 06:53:21.260754 6899 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:53:21.261093 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:21.261111 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:21.261174 6899 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 06:53:21.261711 6899 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:21.261905 6899 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 06:53:21.261923 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:21.261944 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:21.261957 6899 factory.go:656] Stopping watch factory\\\\nI0307 06:53:21.261967 6899 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.324599 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.324723 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.324759 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.324793 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.324822 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.427685 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.427720 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.427729 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.427743 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.427752 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.530646 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.530684 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.530694 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.530711 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.530721 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.627204 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.629518 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.629778 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.633023 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.633096 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.633108 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.633131 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.633145 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.654014 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:21Z\\\",\\\"message\\\":\\\"ers/externalversions/factory.go:140\\\\nI0307 06:53:21.260819 6899 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:53:21.260803 6899 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:21.261051 6899 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:21.261079 6899 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0307 06:53:21.260754 6899 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:53:21.261093 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:21.261111 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:21.261174 6899 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 06:53:21.261711 6899 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:21.261905 6899 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 06:53:21.261923 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:21.261944 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:21.261957 6899 factory.go:656] Stopping watch factory\\\\nI0307 06:53:21.261967 6899 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.666926 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.679414 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.697674 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.711580 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.725199 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.736528 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.736575 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.736593 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.736618 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.736638 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.741693 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.757611 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.775219 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc 
kubenswrapper[4941]: I0307 06:53:34.800967 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.818171 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.834039 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.839066 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.839097 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.839110 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.839127 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.839141 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.852512 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.864415 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.883270 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.899288 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.911538 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:34Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.942032 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.942071 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.942082 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.942099 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.942112 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:34Z","lastTransitionTime":"2026-03-07T06:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:34 crc kubenswrapper[4941]: I0307 06:53:34.954201 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:34 crc kubenswrapper[4941]: E0307 06:53:34.954437 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.044630 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.044667 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.044676 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.044693 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.044705 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.147928 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.147994 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.148011 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.148035 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.148052 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.250893 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.250937 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.250946 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.250965 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.250975 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.351776 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.351833 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.351846 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.351866 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.351879 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.375881 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:35Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.381051 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.381111 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.381124 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.381145 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.381158 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.396792 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:35Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.402704 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.402764 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.402777 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.402795 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.402808 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.418118 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:35Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.422743 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.422792 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.422805 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.422821 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.422834 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.439667 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:35Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.444522 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.444588 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.444606 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.444629 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.444643 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.458672 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:35Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.458857 4941 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.460931 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.460972 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.460984 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.461002 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.461016 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.564223 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.564297 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.564310 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.564330 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.564343 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.667129 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.667192 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.667204 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.667224 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.667239 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.770310 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.770371 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.770385 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.770423 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.770438 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.873719 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.873786 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.873795 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.873817 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.873829 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.954357 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.954595 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.954715 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.955021 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.955189 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.955521 4941 scope.go:117] "RemoveContainer" containerID="6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d" Mar 07 06:53:35 crc kubenswrapper[4941]: E0307 06:53:35.955652 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.976420 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.976456 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.976465 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.976483 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:35 crc kubenswrapper[4941]: I0307 06:53:35.976492 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:35Z","lastTransitionTime":"2026-03-07T06:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.079567 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.079611 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.079621 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.079637 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.079647 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.183147 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.183542 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.183552 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.183568 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.183581 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.286348 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.286421 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.286431 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.286447 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.286459 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.389165 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.389220 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.389230 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.389252 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.389265 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.492823 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.492888 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.492903 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.492927 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.492943 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.595362 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.595422 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.595433 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.595447 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.595456 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.638100 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/1.log" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.646314 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.646729 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.662924 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.679201 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.693213 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.697771 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.697809 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.697817 4941 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.697832 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.697841 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.704138 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c
2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.727675 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc4
7fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.738192 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.754970 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.768479 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.783317 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.800506 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.800571 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.800589 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.800609 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.800623 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.804727 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:21Z\\\",\\\"message\\\":\\\"ers/externalversions/factory.go:140\\\\nI0307 06:53:21.260819 6899 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:53:21.260803 6899 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:21.261051 6899 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:21.261079 6899 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0307 06:53:21.260754 6899 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:53:21.261093 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:21.261111 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:21.261174 6899 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 06:53:21.261711 6899 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:21.261905 6899 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 06:53:21.261923 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:21.261944 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:21.261957 6899 factory.go:656] Stopping watch factory\\\\nI0307 06:53:21.261967 6899 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.816197 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.828329 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.852645 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.867174 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.884021 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.897960 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.902803 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.902834 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.902847 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.902865 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.902878 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:36Z","lastTransitionTime":"2026-03-07T06:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:36 crc kubenswrapper[4941]: I0307 06:53:36.911235 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:36 crc 
kubenswrapper[4941]: I0307 06:53:36.953981 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:36 crc kubenswrapper[4941]: E0307 06:53:36.954087 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.004982 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.005038 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.005046 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.005061 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.005073 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.107914 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.107974 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.107985 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.108001 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.108016 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.211835 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.211919 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.211944 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.211973 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.211997 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.314595 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.314662 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.314680 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.314707 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.314726 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.417870 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.417941 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.417959 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.417985 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.418005 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.521451 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.521497 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.521513 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.521564 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.521579 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.624037 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.624081 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.624090 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.624130 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.624152 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.652787 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/2.log" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.653304 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/1.log" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.656438 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613" exitCode=1 Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.656498 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.656551 4941 scope.go:117] "RemoveContainer" containerID="6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.657577 4941 scope.go:117] "RemoveContainer" containerID="954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613" Mar 07 06:53:37 crc kubenswrapper[4941]: E0307 06:53:37.657862 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.673741 4941 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.688369 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n 
\\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.701899 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.715037 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.728326 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.728376 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.728391 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.728431 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.728446 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.736976 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.748637 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.760504 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.774617 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.789508 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.803675 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.830585 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6762ba99e54272fe1b7c9c949f77de215d75168a44cbac08f859d18d5d49502d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:21Z\\\",\\\"message\\\":\\\"ers/externalversions/factory.go:140\\\\nI0307 06:53:21.260819 6899 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:53:21.260803 6899 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 06:53:21.261051 6899 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 06:53:21.261079 6899 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0307 06:53:21.260754 6899 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:53:21.261093 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:53:21.261111 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:53:21.261174 6899 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0307 06:53:21.261711 6899 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:53:21.261905 6899 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 06:53:21.261923 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 06:53:21.261944 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:53:21.261957 6899 factory.go:656] Stopping watch factory\\\\nI0307 06:53:21.261967 6899 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it 
has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",
\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.831363 4941 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.831423 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.831440 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.831459 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.831473 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.845558 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.858377 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc 
kubenswrapper[4941]: I0307 06:53:37.880372 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.896636 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.916680 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.932306 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:37Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.933767 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.933807 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.933816 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.933831 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.933842 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:37Z","lastTransitionTime":"2026-03-07T06:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.954860 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.955016 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:37 crc kubenswrapper[4941]: E0307 06:53:37.955062 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:37 crc kubenswrapper[4941]: I0307 06:53:37.955129 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:37 crc kubenswrapper[4941]: E0307 06:53:37.955230 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:37 crc kubenswrapper[4941]: E0307 06:53:37.955354 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.036842 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.036893 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.036903 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.036923 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.036934 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.140274 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.140883 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.140915 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.140942 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.140963 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.244077 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.244136 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.244149 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.244171 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.244190 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.347335 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.347393 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.347445 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.347473 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.347491 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.450521 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.451055 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.451256 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.451435 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.451592 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.554750 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.555134 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.555243 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.555352 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.555468 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.658314 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.658740 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.659039 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.659262 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.659506 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.661975 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/2.log" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.666597 4941 scope.go:117] "RemoveContainer" containerID="954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613" Mar 07 06:53:38 crc kubenswrapper[4941]: E0307 06:53:38.666982 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.690018 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.711427 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.728023 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.746418 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc 
kubenswrapper[4941]: I0307 06:53:38.763025 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.763103 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.763119 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.763139 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.763151 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.775927 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.793291 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.810688 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.823642 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.841028 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.852913 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.867491 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.867537 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.867552 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.867575 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.867592 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.870369 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.884172 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.897674 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.920930 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.935059 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.948907 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.953802 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:38 crc kubenswrapper[4941]: E0307 06:53:38.953966 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.965014 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:38Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.970194 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.970274 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.970325 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.970351 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:38 crc kubenswrapper[4941]: I0307 06:53:38.970366 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:38Z","lastTransitionTime":"2026-03-07T06:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.073544 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.073607 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.073621 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.073641 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.073655 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.177223 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.177279 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.177290 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.177306 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.177319 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.279744 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.279831 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.279855 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.279879 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.279898 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.383517 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.383568 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.383580 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.383606 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.383618 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.486769 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.486848 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.486861 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.486909 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.486922 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.590050 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.590128 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.590137 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.590157 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.590170 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.693018 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.693090 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.693101 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.693135 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.693148 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.796580 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.796617 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.796627 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.796641 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.796651 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.899916 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.899982 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.899999 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.900025 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.900043 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:39Z","lastTransitionTime":"2026-03-07T06:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.954579 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.954672 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:39 crc kubenswrapper[4941]: I0307 06:53:39.954625 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:39 crc kubenswrapper[4941]: E0307 06:53:39.954799 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:39 crc kubenswrapper[4941]: E0307 06:53:39.954940 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:39 crc kubenswrapper[4941]: E0307 06:53:39.955171 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.003453 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.003505 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.003515 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.003533 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.003544 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.106934 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.106984 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.106997 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.107016 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.107030 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.210244 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.210319 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.210333 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.210354 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.210368 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.313679 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.313750 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.313780 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.313805 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.313824 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.416492 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.416538 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.416552 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.416568 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.416591 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.519942 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.520031 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.520056 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.520090 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.520114 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.623384 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.623563 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.623588 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.623622 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.623643 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.726262 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.726313 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.726324 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.726346 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.726360 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.829735 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.829801 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.829816 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.829837 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.829850 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.933215 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.933279 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.933293 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.933311 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.933323 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:40Z","lastTransitionTime":"2026-03-07T06:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:40 crc kubenswrapper[4941]: I0307 06:53:40.953916 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:40 crc kubenswrapper[4941]: E0307 06:53:40.954164 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.036564 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.036608 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.036616 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.036634 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.036645 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.139747 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.139817 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.139839 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.139865 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.139880 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.242657 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.242714 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.242729 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.242754 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.242771 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.345299 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.345354 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.345364 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.345384 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.345395 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.449024 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.449079 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.449093 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.449111 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.449125 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.551582 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.551886 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.551902 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.551928 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.551942 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.655565 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.655645 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.655673 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.655739 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.655765 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.759624 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.759671 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.759682 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.759696 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.759706 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.856505 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.856685 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 06:54:13.856651949 +0000 UTC m=+150.809017414 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.856745 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.856802 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.856919 4941 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.856977 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-07 06:54:13.856962257 +0000 UTC m=+150.809327722 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.856992 4941 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.857124 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:54:13.85708891 +0000 UTC m=+150.809454405 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.862338 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.862381 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.862391 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.862419 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.862431 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.954497 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.954507 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.955152 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.954525 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.955243 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.955334 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.958050 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.958107 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.958141 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958173 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958195 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958208 4941 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958255 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958263 4941 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958277 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958289 4941 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958259 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 06:54:13.958242682 +0000 UTC m=+150.910608147 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958319 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs podName:80030e60-caa3-4aad-8b00-10f5143d9243 nodeName:}" failed. No retries permitted until 2026-03-07 06:54:13.958306963 +0000 UTC m=+150.910672428 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs") pod "network-metrics-daemon-q9fpr" (UID: "80030e60-caa3-4aad-8b00-10f5143d9243") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:53:41 crc kubenswrapper[4941]: E0307 06:53:41.958338 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 06:54:13.958331004 +0000 UTC m=+150.910696469 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.965034 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.965079 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.965091 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.965108 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:41 crc kubenswrapper[4941]: I0307 06:53:41.965121 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:41Z","lastTransitionTime":"2026-03-07T06:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.067291 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.067341 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.067355 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.067374 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.067390 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.170818 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.170862 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.170873 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.170887 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.170897 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.273703 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.273764 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.273780 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.273801 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.273814 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.375902 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.375968 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.375981 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.375998 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.376010 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.479560 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.479601 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.479612 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.479630 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.479640 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.582880 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.582945 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.582962 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.582991 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.583011 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.685574 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.685635 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.685647 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.685665 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.685713 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.788816 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.788875 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.788893 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.788920 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.788939 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.891678 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.891734 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.891747 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.891766 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.891779 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.953747 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:42 crc kubenswrapper[4941]: E0307 06:53:42.953978 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.995095 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.995158 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.995172 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.995193 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:42 crc kubenswrapper[4941]: I0307 06:53:42.995208 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:42Z","lastTransitionTime":"2026-03-07T06:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.099248 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.099316 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.099333 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.099366 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.099385 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:43Z","lastTransitionTime":"2026-03-07T06:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.202537 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.202864 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.202883 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.202911 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.202930 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:43Z","lastTransitionTime":"2026-03-07T06:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.306348 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.306439 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.306458 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.306484 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.306502 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:43Z","lastTransitionTime":"2026-03-07T06:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.409890 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.409953 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.409969 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.409995 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.410017 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:43Z","lastTransitionTime":"2026-03-07T06:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.512459 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.512513 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.512528 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.512548 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.512566 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:43Z","lastTransitionTime":"2026-03-07T06:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.616272 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.616361 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.616394 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.616461 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.616481 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:43Z","lastTransitionTime":"2026-03-07T06:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.718749 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.718799 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.718812 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.718830 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.718843 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:43Z","lastTransitionTime":"2026-03-07T06:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.821663 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.821740 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.821782 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.821810 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.821829 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:43Z","lastTransitionTime":"2026-03-07T06:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:53:43 crc kubenswrapper[4941]: E0307 06:53:43.922475 4941 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.956009 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:43 crc kubenswrapper[4941]: E0307 06:53:43.956235 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.956377 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:43 crc kubenswrapper[4941]: E0307 06:53:43.956613 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.956676 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:43 crc kubenswrapper[4941]: E0307 06:53:43.956860 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.977744 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:43Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:43 crc kubenswrapper[4941]: I0307 06:53:43.991744 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:43Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.010192 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc4
7fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.021330 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.038133 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.052253 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.066993 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: E0307 06:53:44.074428 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.098161 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.141083 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.166341 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.181758 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.193888 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.206247 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.217536 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.229321 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc 
kubenswrapper[4941]: I0307 06:53:44.249799 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.265766 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:44Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:44 crc kubenswrapper[4941]: I0307 06:53:44.954384 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:44 crc kubenswrapper[4941]: E0307 06:53:44.954645 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.673848 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.673951 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.673978 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.674014 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.674042 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:45Z","lastTransitionTime":"2026-03-07T06:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.697352 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:45Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.703227 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.703282 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.703305 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.703335 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.703357 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:45Z","lastTransitionTime":"2026-03-07T06:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.725816 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:45Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.731044 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.731080 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.731092 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.731109 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.731120 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:45Z","lastTransitionTime":"2026-03-07T06:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.747965 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:45Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.751920 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.751947 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.751957 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.751971 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.751981 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:45Z","lastTransitionTime":"2026-03-07T06:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.767020 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:45Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.771302 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.771339 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.771350 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.771366 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.771378 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:45Z","lastTransitionTime":"2026-03-07T06:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.790129 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:45Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.790442 4941 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.953913 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.954610 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.954872 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:45 crc kubenswrapper[4941]: I0307 06:53:45.954911 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.955729 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:45 crc kubenswrapper[4941]: E0307 06:53:45.955927 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:46 crc kubenswrapper[4941]: I0307 06:53:46.954307 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:46 crc kubenswrapper[4941]: E0307 06:53:46.954620 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:47 crc kubenswrapper[4941]: I0307 06:53:47.954013 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:47 crc kubenswrapper[4941]: I0307 06:53:47.954051 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:47 crc kubenswrapper[4941]: E0307 06:53:47.954202 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:47 crc kubenswrapper[4941]: I0307 06:53:47.954220 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:47 crc kubenswrapper[4941]: E0307 06:53:47.954329 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:47 crc kubenswrapper[4941]: E0307 06:53:47.954442 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:48 crc kubenswrapper[4941]: I0307 06:53:48.954573 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:48 crc kubenswrapper[4941]: E0307 06:53:48.954744 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:49 crc kubenswrapper[4941]: E0307 06:53:49.076063 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:53:49 crc kubenswrapper[4941]: I0307 06:53:49.954432 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:49 crc kubenswrapper[4941]: I0307 06:53:49.954460 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:49 crc kubenswrapper[4941]: I0307 06:53:49.954654 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:49 crc kubenswrapper[4941]: E0307 06:53:49.954816 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:49 crc kubenswrapper[4941]: E0307 06:53:49.954933 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:49 crc kubenswrapper[4941]: E0307 06:53:49.955036 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:49 crc kubenswrapper[4941]: I0307 06:53:49.972313 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 07 06:53:50 crc kubenswrapper[4941]: I0307 06:53:50.954163 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:50 crc kubenswrapper[4941]: E0307 06:53:50.954395 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:51 crc kubenswrapper[4941]: I0307 06:53:51.953987 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:51 crc kubenswrapper[4941]: I0307 06:53:51.954070 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:51 crc kubenswrapper[4941]: E0307 06:53:51.954214 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:51 crc kubenswrapper[4941]: I0307 06:53:51.954014 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:51 crc kubenswrapper[4941]: E0307 06:53:51.954690 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:51 crc kubenswrapper[4941]: E0307 06:53:51.955050 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:51 crc kubenswrapper[4941]: I0307 06:53:51.956381 4941 scope.go:117] "RemoveContainer" containerID="954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613" Mar 07 06:53:51 crc kubenswrapper[4941]: E0307 06:53:51.956648 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:53:52 crc kubenswrapper[4941]: I0307 06:53:52.953692 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:52 crc kubenswrapper[4941]: E0307 06:53:52.954385 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.728515 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.750655 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d286d622-e719-4784-b799-4dd404e59819\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0
2f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.776092 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.798548 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.816063 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc 
kubenswrapper[4941]: I0307 06:53:53.842863 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.860917 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.874010 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.885341 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.898484 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.908629 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.930729 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.945288 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.954553 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.954620 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.954628 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:53 crc kubenswrapper[4941]: E0307 06:53:53.954745 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:53 crc kubenswrapper[4941]: E0307 06:53:53.954855 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:53 crc kubenswrapper[4941]: E0307 06:53:53.954961 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.961146 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.981549 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:53 crc kubenswrapper[4941]: I0307 06:53:53.992505 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:53Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.003645 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.016534 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.027763 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.039028 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d286d622-e719-4784-b799-4dd404e59819\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.053284 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.072705 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: E0307 06:53:54.076609 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.089374 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc 
kubenswrapper[4941]: I0307 06:53:54.116063 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.132237 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.145800 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.159647 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.182224 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.194862 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.211752 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.224227 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.234968 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.252331 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.264906 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.281370 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.305614 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.320933 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:54Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:54 crc kubenswrapper[4941]: I0307 06:53:54.954657 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:54 crc kubenswrapper[4941]: E0307 06:53:54.954932 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.954246 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.954246 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:55 crc kubenswrapper[4941]: E0307 06:53:55.954443 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:55 crc kubenswrapper[4941]: E0307 06:53:55.954606 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.954583 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:55 crc kubenswrapper[4941]: E0307 06:53:55.954783 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.969236 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.969296 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.969314 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.969334 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.969352 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:55Z","lastTransitionTime":"2026-03-07T06:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:55 crc kubenswrapper[4941]: E0307 06:53:55.989140 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:55Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.994282 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.994361 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.994379 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.994432 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:55 crc kubenswrapper[4941]: I0307 06:53:55.994454 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:55Z","lastTransitionTime":"2026-03-07T06:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:56 crc kubenswrapper[4941]: E0307 06:53:56.015354 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:56Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.019753 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.019854 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.019884 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.019930 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.019957 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:56Z","lastTransitionTime":"2026-03-07T06:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:56 crc kubenswrapper[4941]: E0307 06:53:56.037941 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:56Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.041847 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.041903 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.041921 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.041947 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.041965 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:56Z","lastTransitionTime":"2026-03-07T06:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:56 crc kubenswrapper[4941]: E0307 06:53:56.056986 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:56Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.061602 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.061652 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.061665 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.061683 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.061695 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:53:56Z","lastTransitionTime":"2026-03-07T06:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:53:56 crc kubenswrapper[4941]: E0307 06:53:56.077654 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:56Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:56 crc kubenswrapper[4941]: E0307 06:53:56.077788 4941 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 06:53:56 crc kubenswrapper[4941]: I0307 06:53:56.954285 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:56 crc kubenswrapper[4941]: E0307 06:53:56.954587 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.740594 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/0.log" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.740679 4941 generic.go:334] "Generic (PLEG): container finished" podID="ed82bc0c-1609-449c-b2e2-2fe04af9749d" containerID="9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7" exitCode=1 Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.740730 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kc9rw" event={"ID":"ed82bc0c-1609-449c-b2e2-2fe04af9749d","Type":"ContainerDied","Data":"9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7"} Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.741226 4941 scope.go:117] "RemoveContainer" containerID="9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.760531 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc 
kubenswrapper[4941]: I0307 06:53:57.787596 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.804047 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.816392 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d286d622-e719-4784-b799-4dd404e59819\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-0
7T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.830866 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.843254 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.855275 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.868653 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' 
sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.882137 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.891804 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.905617 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.915571 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.927713 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.941052 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:57Z\\\",\\\"message\\\":\\\"2026-03-07T06:53:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7\\\\n2026-03-07T06:53:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7 to /host/opt/cni/bin/\\\\n2026-03-07T06:53:12Z [verbose] multus-daemon started\\\\n2026-03-07T06:53:12Z [verbose] Readiness Indicator file check\\\\n2026-03-07T06:53:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.953983 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.954006 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.953994 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:57 crc kubenswrapper[4941]: E0307 06:53:57.954151 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:53:57 crc kubenswrapper[4941]: E0307 06:53:57.954254 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:57 crc kubenswrapper[4941]: E0307 06:53:57.954363 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.954512 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.970253 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:57 crc kubenswrapper[4941]: I0307 06:53:57.997162 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:57Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.012025 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.748757 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/0.log" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.748851 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kc9rw" event={"ID":"ed82bc0c-1609-449c-b2e2-2fe04af9749d","Type":"ContainerStarted","Data":"9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261"} Mar 07 06:53:58 crc 
kubenswrapper[4941]: I0307 06:53:58.770300 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.787727 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:57Z\\\",\\\"message\\\":\\\"2026-03-07T06:53:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7\\\\n2026-03-07T06:53:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7 to /host/opt/cni/bin/\\\\n2026-03-07T06:53:12Z [verbose] multus-daemon started\\\\n2026-03-07T06:53:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-07T06:53:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.798081 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.809115 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.827049 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.842225 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.855023 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc 
kubenswrapper[4941]: I0307 06:53:58.884056 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.899221 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.920514 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d286d622-e719-4784-b799-4dd404e59819\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-0
7T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.936506 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.953541 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.953511 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e
26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: E0307 06:53:58.953691 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.968210 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:58 crc kubenswrapper[4941]: I0307 06:53:58.984452 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\
\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:59 crc kubenswrapper[4941]: I0307 06:53:59.000019 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:58Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:59 crc kubenswrapper[4941]: I0307 06:53:59.012387 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:59Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:59 crc kubenswrapper[4941]: I0307 06:53:59.030479 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:59Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:59 crc kubenswrapper[4941]: I0307 06:53:59.045455 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:59Z is after 2025-08-24T17:21:41Z" Mar 07 06:53:59 crc kubenswrapper[4941]: E0307 06:53:59.078678 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:53:59 crc kubenswrapper[4941]: I0307 06:53:59.953632 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:53:59 crc kubenswrapper[4941]: I0307 06:53:59.953687 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:53:59 crc kubenswrapper[4941]: I0307 06:53:59.953728 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:53:59 crc kubenswrapper[4941]: E0307 06:53:59.953808 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:53:59 crc kubenswrapper[4941]: E0307 06:53:59.953914 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:53:59 crc kubenswrapper[4941]: E0307 06:53:59.954069 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:00 crc kubenswrapper[4941]: I0307 06:54:00.953713 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:00 crc kubenswrapper[4941]: E0307 06:54:00.953955 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:01 crc kubenswrapper[4941]: I0307 06:54:01.954661 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:01 crc kubenswrapper[4941]: I0307 06:54:01.954714 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:01 crc kubenswrapper[4941]: I0307 06:54:01.954658 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:01 crc kubenswrapper[4941]: E0307 06:54:01.954844 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:01 crc kubenswrapper[4941]: E0307 06:54:01.955006 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:01 crc kubenswrapper[4941]: E0307 06:54:01.955152 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:02 crc kubenswrapper[4941]: I0307 06:54:02.954297 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:02 crc kubenswrapper[4941]: E0307 06:54:02.954601 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:03 crc kubenswrapper[4941]: I0307 06:54:03.953809 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:03 crc kubenswrapper[4941]: I0307 06:54:03.954069 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:03 crc kubenswrapper[4941]: I0307 06:54:03.954074 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:03 crc kubenswrapper[4941]: E0307 06:54:03.954394 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:03 crc kubenswrapper[4941]: E0307 06:54:03.954736 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:03 crc kubenswrapper[4941]: E0307 06:54:03.955007 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:03 crc kubenswrapper[4941]: I0307 06:54:03.970815 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:03Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:03 crc kubenswrapper[4941]: I0307 06:54:03.990262 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:57Z\\\",\\\"message\\\":\\\"2026-03-07T06:53:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7\\\\n2026-03-07T06:53:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7 to /host/opt/cni/bin/\\\\n2026-03-07T06:53:12Z [verbose] multus-daemon started\\\\n2026-03-07T06:53:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-07T06:53:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:03Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.008249 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.029870 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.054779 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.070459 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: E0307 06:54:04.079215 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.091310 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc 
kubenswrapper[4941]: I0307 06:54:04.114379 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.128817 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.139879 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d286d622-e719-4784-b799-4dd404e59819\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-0
7T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.153759 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.165786 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.178697 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.190367 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' 
sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.201661 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.210288 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.222130 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.230185 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:04Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.953705 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:04 crc kubenswrapper[4941]: E0307 06:54:04.954278 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:04 crc kubenswrapper[4941]: I0307 06:54:04.954645 4941 scope.go:117] "RemoveContainer" containerID="954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.777343 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/2.log" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.780135 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.780683 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.792538 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc 
kubenswrapper[4941]: I0307 06:54:05.816907 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.837424 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.861459 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d286d622-e719-4784-b799-4dd404e59819\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-0
7T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.883761 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.898480 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.911134 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.925657 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' 
sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.939962 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.952452 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.953783 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.953817 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:05 crc kubenswrapper[4941]: E0307 06:54:05.953950 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.954155 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:05 crc kubenswrapper[4941]: E0307 06:54:05.954221 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:05 crc kubenswrapper[4941]: E0307 06:54:05.954353 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.973997 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95
b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0
dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:05 crc kubenswrapper[4941]: I0307 06:54:05.987290 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.001847 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:05Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.016055 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07
T06:53:57Z\\\",\\\"message\\\":\\\"2026-03-07T06:53:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7\\\\n2026-03-07T06:53:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7 to /host/opt/cni/bin/\\\\n2026-03-07T06:53:12Z [verbose] multus-daemon started\\\\n2026-03-07T06:53:12Z [verbose] Readiness Indicator file check\\\\n2026-03-07T06:53:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\
\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.033220 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.045532 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.064667 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.075225 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.250032 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.250082 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.250092 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:54:06 crc 
kubenswrapper[4941]: I0307 06:54:06.250108 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.250119 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:54:06Z","lastTransitionTime":"2026-03-07T06:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:54:06 crc kubenswrapper[4941]: E0307 06:54:06.266827 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.271487 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.271525 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.271538 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.271573 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.271586 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:54:06Z","lastTransitionTime":"2026-03-07T06:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:54:06 crc kubenswrapper[4941]: E0307 06:54:06.285433 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.289377 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.289474 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.289494 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.289516 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.289532 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:54:06Z","lastTransitionTime":"2026-03-07T06:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:54:06 crc kubenswrapper[4941]: E0307 06:54:06.302278 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.306553 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.306603 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.306618 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.306639 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.306655 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:54:06Z","lastTransitionTime":"2026-03-07T06:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:54:06 crc kubenswrapper[4941]: E0307 06:54:06.319507 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.323537 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.323576 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.323588 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.323607 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.323620 4941 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:54:06Z","lastTransitionTime":"2026-03-07T06:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 06:54:06 crc kubenswrapper[4941]: E0307 06:54:06.339269 4941 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T06:54:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0215de21-ea41-473f-b375-94c4974c5b21\\\",\\\"systemUUID\\\":\\\"d663d73a-c5af-4e4f-81fe-9bf574386cbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: E0307 06:54:06.339384 4941 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.786732 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/3.log" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.787592 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/2.log" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.790635 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" exitCode=1 Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.790675 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.790712 4941 scope.go:117] "RemoveContainer" containerID="954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.791498 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 06:54:06 crc kubenswrapper[4941]: E0307 06:54:06.791700 4941 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.804586 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.819097 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.836024 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.848480 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.861392 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.874119 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.885814 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.911268 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://954ef199f279c51f468c031110aa212f8944bc8a56dc482b6df9db0683aab613\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:36Z\\\",\\\"message\\\":\\\"_uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 06:53:36.773003 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:53:36Z is after 2025-08-24T17:21:41Z]\\\\nI0307 06:53:36.773779 7105 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_sn\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:54:05Z\\\",\\\"message\\\":\\\"orkPolicy event handler 4 for removal\\\\nI0307 06:54:05.953215 7437 factory.go:656] Stopping watch factory\\\\nI0307 06:54:05.953235 7437 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0307 06:54:05.953236 7437 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:54:05.951560 7437 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:54:05.953246 7437 handler.go:208] Removed *v1.EgressFirewall event 
handler 9\\\\nI0307 06:54:05.953259 7437 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:54:05.953271 7437 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:54:05.953282 7437 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 06:54:05.953288 7437 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:54:05.951623 7437 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:54:05.951652 7437 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:54:05.951511 7437 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.924914 4941 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.936652 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.953613 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:06 crc kubenswrapper[4941]: E0307 06:54:06.953790 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.954751 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:57Z\\\",\\\"message\\\":\\\"2026-03-07T06:53:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7\\\\n2026-03-07T06:53:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7 to /host/opt/cni/bin/\\\\n2026-03-07T06:53:12Z [verbose] multus-daemon started\\\\n2026-03-07T06:53:12Z [verbose] Readiness Indicator file check\\\\n2026-03-07T06:53:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.971641 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T0
6:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:06 crc kubenswrapper[4941]: I0307 06:54:06.986100 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d286d622-e719-4784-b799-4dd404e59819\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.000896 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:06Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.015081 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.028565 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc 
kubenswrapper[4941]: I0307 06:54:07.051870 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.066246 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.797383 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/3.log" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.801154 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 06:54:07 crc kubenswrapper[4941]: E0307 06:54:07.801309 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.815365 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.831717 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.845263 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.867181 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.880354 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.899296 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kc9rw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed82bc0c-1609-449c-b2e2-2fe04af9749d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:53:57Z\\\",\\\"message\\\":\\\"2026-03-07T06:53:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7\\\\n2026-03-07T06:53:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2483e3f0-97e5-42c9-b914-d7152d964ce7 to /host/opt/cni/bin/\\\\n2026-03-07T06:53:12Z [verbose] multus-daemon started\\\\n2026-03-07T06:53:12Z [verbose] Readiness Indicator file check\\\\n2026-03-07T06:53:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rxsbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kc9rw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.913919 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.927137 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7630b2c1eaa53f6c866b1da4d5818360f29ff6bf5dace90d072cddac793ddb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.954392 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.954454 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.954442 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:07 crc kubenswrapper[4941]: E0307 06:54:07.954627 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:07 crc kubenswrapper[4941]: E0307 06:54:07.954701 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:07 crc kubenswrapper[4941]: E0307 06:54:07.954829 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.959327 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3469f59-621c-4493-ade3-768772d05ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T06:54:05Z\\\",\\\"message\\\":\\\"orkPolicy event handler 4 for removal\\\\nI0307 06:54:05.953215 7437 factory.go:656] Stopping watch factory\\\\nI0307 06:54:05.953235 7437 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0307 06:54:05.953236 7437 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 06:54:05.951560 7437 reflector.go:311] Stopping reflector 
*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:54:05.953246 7437 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 06:54:05.953259 7437 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 06:54:05.953271 7437 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 06:54:05.953282 7437 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 06:54:05.953288 7437 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0307 06:54:05.951623 7437 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 06:54:05.951652 7437 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0307 06:54:05.951511 7437 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:54:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bea33cce0953aac81
f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcp6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5ztp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.973681 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c88344
9f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:07 crc kubenswrapper[4941]: I0307 06:54:07.988276 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf1fc89-d66c-4cd5-b2ea-9537627bdf39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc59355f96a4342e0c8ec7554397fc53402f455c9f3451697587433cade96d02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2812ff1627396d1f8df39b1b8edb6f7cf6ae3
eb7608f60f9b9df4480215f1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dh96x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-w48fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:07Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:08 crc kubenswrapper[4941]: I0307 06:54:08.007168 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97e6515-2f01-45bf-b366-0c5a30e87a76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7951b7b496ade64ac231e5f27153b7fe68454e871b06a02e6043f97f80d3fcee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc4408c28612093a98f6505e7b058c66c91efeae82a8ca23d46b05d6b94acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a685d39bf8ce384cd27ced3517fe90b8b6ff24ae6ce377c69b8b55bed1c5a24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b0e31983585bd1b64249fc864fe39c95fd7790026242972fe742052360f1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b0ed8591922bda8d8e43239c2e4ea5123affe204adff18ec7a410095c56d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2ddae5f4f91631ea822554910b0c21fc63b9c632894ac89a9b54f8f14a8faba\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc9c6a1d39904d028db419b348f1d0d7b740c8686aa221365044476b1a6e9863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44fce5e6cb133478a935e953d4ad21eb7512959784885fc300fad9b69c9772b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:08Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:08 crc kubenswrapper[4941]: I0307 06:54:08.020958 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 06:52:49.460652 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 06:52:49.460863 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 06:52:49.462109 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2106164525/tls.crt::/tmp/serving-cert-2106164525/tls.key\\\\\\\"\\\\nI0307 06:52:49.937489 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 06:52:49.939239 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 06:52:49.939258 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 06:52:49.939282 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 06:52:49.939287 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 06:52:49.943291 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 06:52:49.943327 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0307 06:52:49.943323 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0307 06:52:49.943335 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 06:52:49.943386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 06:52:49.943390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 06:52:49.943394 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 06:52:49.943403 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0307 06:52:49.946084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:52:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:08Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:08 crc kubenswrapper[4941]: I0307 06:54:08.034445 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d286d622-e719-4784-b799-4dd404e59819\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4f03c38ce90dccca05dd6e447d9edc485d4c85b0aa2f3206a561868bf7c66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d58b1c0b016385b1f6474ecda4ac3f92786ad41f9fc2d2f8f4af0ff418608a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b6f4484dc1d227d69c21f0d44cda7a24648e102571804f512428de36b1d53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://02f9486703fedfe863bebf365dfb615c9b972a637b4660a33d8041c5691d4143\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:08Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:08 crc kubenswrapper[4941]: I0307 06:54:08.048373 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:08Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:08 crc kubenswrapper[4941]: I0307 06:54:08.062619 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75f9eefee89feb694991dadc343343984d9c000a4df5b2b57e9bb9413966cb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fae4c3e26356bb548ed7d06dd72d54e7619f5a7fcb4aac2abe0c67c4cacb74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:08Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:08 crc kubenswrapper[4941]: I0307 06:54:08.075742 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80030e60-caa3-4aad-8b00-10f5143d9243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sn24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q9fpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:08Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:08 crc 
kubenswrapper[4941]: I0307 06:54:08.093730 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a72f9b9913fcc5df2e4eca561de1ca8f0b3d5236833e1e2210c880782149c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:08Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:08 crc kubenswrapper[4941]: I0307 06:54:08.954593 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:08 crc kubenswrapper[4941]: E0307 06:54:08.954853 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:09 crc kubenswrapper[4941]: E0307 06:54:09.081037 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:09 crc kubenswrapper[4941]: I0307 06:54:09.954447 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:09 crc kubenswrapper[4941]: I0307 06:54:09.954459 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:09 crc kubenswrapper[4941]: I0307 06:54:09.954459 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:09 crc kubenswrapper[4941]: E0307 06:54:09.955939 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:09 crc kubenswrapper[4941]: E0307 06:54:09.956038 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:09 crc kubenswrapper[4941]: E0307 06:54:09.956093 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:10 crc kubenswrapper[4941]: I0307 06:54:10.953945 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:10 crc kubenswrapper[4941]: E0307 06:54:10.954299 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:11 crc kubenswrapper[4941]: I0307 06:54:11.954374 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:11 crc kubenswrapper[4941]: E0307 06:54:11.954667 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:11 crc kubenswrapper[4941]: I0307 06:54:11.954777 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:11 crc kubenswrapper[4941]: I0307 06:54:11.954777 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:11 crc kubenswrapper[4941]: E0307 06:54:11.955082 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:11 crc kubenswrapper[4941]: E0307 06:54:11.954956 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:12 crc kubenswrapper[4941]: I0307 06:54:12.953648 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:12 crc kubenswrapper[4941]: E0307 06:54:12.953817 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.921210 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:54:13 crc kubenswrapper[4941]: E0307 06:54:13.921368 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.921344829 +0000 UTC m=+214.873710304 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.921735 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.921795 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:13 crc kubenswrapper[4941]: E0307 06:54:13.921879 4941 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:54:13 crc kubenswrapper[4941]: E0307 06:54:13.921889 4941 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:54:13 crc kubenswrapper[4941]: E0307 06:54:13.921934 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.921918345 +0000 UTC m=+214.874283810 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 06:54:13 crc kubenswrapper[4941]: E0307 06:54:13.921951 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.921944876 +0000 UTC m=+214.874310341 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.954352 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.954396 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.954352 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:13 crc kubenswrapper[4941]: E0307 06:54:13.954557 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:13 crc kubenswrapper[4941]: E0307 06:54:13.954781 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:13 crc kubenswrapper[4941]: E0307 06:54:13.954865 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.968054 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lv4jp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dd327c3-0368-47cd-87cb-d972354bedee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29aa791fbdd9d038f2d6e6ff75c9a720cd0ef005179e18a8e764d0c6601cfb6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x2jb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lv4jp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.981814 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a5ab23-f96b-4c8e-8c7f-41972f6c41be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e2485bb4d3c244b1c4ede75452ba2bae59191f46c5022a9909ca6049b670be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0839dbf364ac85abd880f527b2f689bf31973d329e4327968b6fb3d04038c990\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T06:52:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 06:51:46.084831 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 06:51:46.088098 1 observer_polling.go:159] Starting file observer\\\\nI0307 06:51:46.120572 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 06:51:46.124621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 06:52:12.800824 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0307 06:52:12.800914 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8362f2e68c454f2c4707ccb4cc70f9bbc42897f21666b42f004d5baff61ed02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa9ad05c65e7fd6eda405251f418332df9ec913fefa1b95affea43db7901548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:51:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:13 crc kubenswrapper[4941]: I0307 06:54:13.992783 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:13Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.002800 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vm6ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8177e9-cc72-474e-95fa-b9d3539f4ad7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64e83c03802daaa11d3d07537a4ce8e02c2e4d0e3b424e33d25b530931f23a48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4x72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vm6ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.016147 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf634026-9cb8-4afa-ad8c-e4f119f04899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd33d3ffcdb1768d9ff5d6e8e73c97f4e8dfa1ccf76e31ba4dd3f4985a47e981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86215cb2ddd13facb5e9c4d80a896e60b68f8a6d74b4c306c5fe86ece69e3955\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d839514a481
f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d839514a481f93a94ddfb255592f4750bafb955082abc523ddf0e685f2b0159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2db947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca20d5dda2d
b947af29cc6f29bb9adb727fcf7a2404116873d5fbea95989562\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83dc47fa00976d0a194df8ec0dc0c1a596d178d1a017b5b738c5006cf48cf66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddf3e8a4e7186bebb5a1fe8ea6445e008779ccefc7ff23d3c39bb56d19cc6305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5cf21432595c7a13eadd4cc9f8233dfcf8ce35f17218713a1c132bac0dbfc0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-07T06:53:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qs9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-q9xqh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.023049 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.023110 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.023139 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023653 4941 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023782 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs podName:80030e60-caa3-4aad-8b00-10f5143d9243 nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.023752874 +0000 UTC m=+214.976118369 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs") pod "network-metrics-daemon-q9fpr" (UID: "80030e60-caa3-4aad-8b00-10f5143d9243") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023667 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023842 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023868 4941 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023682 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023936 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.023916519 +0000 UTC m=+214.976282024 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023972 4941 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.023996 4941 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.024057 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2026-03-07 06:55:18.024036452 +0000 UTC m=+214.976401917 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.027890 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"250d2c0d-993b-466a-a5e0-bacae5fe8df5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a972be902ac9e0bc88e6b09c4b66e4b4e11c094e2bfd979332580f5c31c47d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gs886\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:53:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-knkqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:54:14Z is after 2025-08-24T17:21:41Z" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.064685 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-w48fj" podStartSLOduration=116.064654093 podStartE2EDuration="1m56.064654093s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:14.050887556 +0000 UTC m=+151.003253041" watchObservedRunningTime="2026-03-07 06:54:14.064654093 +0000 UTC m=+151.017019558" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.065053 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kc9rw" podStartSLOduration=117.065044464 podStartE2EDuration="1m57.065044464s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:14.064864768 +0000 UTC m=+151.017230253" watchObservedRunningTime="2026-03-07 06:54:14.065044464 +0000 UTC m=+151.017409939" Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.081673 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.173131 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=54.173112517 podStartE2EDuration="54.173112517s" podCreationTimestamp="2026-03-07 06:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:14.172669624 +0000 UTC m=+151.125035099" watchObservedRunningTime="2026-03-07 06:54:14.173112517 +0000 UTC m=+151.125477982" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.198429 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=25.198382906 podStartE2EDuration="25.198382906s" podCreationTimestamp="2026-03-07 06:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:14.198003316 +0000 UTC m=+151.150368791" watchObservedRunningTime="2026-03-07 06:54:14.198382906 +0000 UTC m=+151.150748371" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.198992 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=55.198984813 podStartE2EDuration="55.198984813s" podCreationTimestamp="2026-03-07 06:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:14.187270684 +0000 UTC m=+151.139636149" watchObservedRunningTime="2026-03-07 06:54:14.198984813 +0000 UTC m=+151.151350278" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.954356 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:14 crc kubenswrapper[4941]: E0307 06:54:14.955171 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:14 crc kubenswrapper[4941]: I0307 06:54:14.966858 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 07 06:54:15 crc kubenswrapper[4941]: I0307 06:54:15.954051 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:15 crc kubenswrapper[4941]: E0307 06:54:15.954238 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:15 crc kubenswrapper[4941]: I0307 06:54:15.954618 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:15 crc kubenswrapper[4941]: I0307 06:54:15.954694 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:15 crc kubenswrapper[4941]: E0307 06:54:15.954846 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:15 crc kubenswrapper[4941]: E0307 06:54:15.954918 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.659996 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.660033 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.660042 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.660058 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.660068 4941 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T06:54:16Z","lastTransitionTime":"2026-03-07T06:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.710537 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7"] Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.711594 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.714061 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.714148 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.714615 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.719885 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.751036 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fff607f-b0f7-4c60-b202-5948a682e3d6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc 
kubenswrapper[4941]: I0307 06:54:16.751136 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fff607f-b0f7-4c60-b202-5948a682e3d6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.751164 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fff607f-b0f7-4c60-b202-5948a682e3d6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.751331 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fff607f-b0f7-4c60-b202-5948a682e3d6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.751374 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fff607f-b0f7-4c60-b202-5948a682e3d6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.756236 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=47.756201945 podStartE2EDuration="47.756201945s" podCreationTimestamp="2026-03-07 06:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:16.740755132 +0000 UTC m=+153.693120597" watchObservedRunningTime="2026-03-07 06:54:16.756201945 +0000 UTC m=+153.708567430" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.768722 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vm6ql" podStartSLOduration=119.768690646 podStartE2EDuration="1m59.768690646s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:16.768455189 +0000 UTC m=+153.720820654" watchObservedRunningTime="2026-03-07 06:54:16.768690646 +0000 UTC m=+153.721056131" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.795456 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q9xqh" podStartSLOduration=119.795430497 podStartE2EDuration="1m59.795430497s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:16.786698012 +0000 UTC m=+153.739063477" watchObservedRunningTime="2026-03-07 06:54:16.795430497 +0000 UTC m=+153.747795962" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.795965 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lv4jp" podStartSLOduration=119.795957621 podStartE2EDuration="1m59.795957621s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 
06:54:16.795354715 +0000 UTC m=+153.747720180" watchObservedRunningTime="2026-03-07 06:54:16.795957621 +0000 UTC m=+153.748323097" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.816244 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.81622112 podStartE2EDuration="2.81622112s" podCreationTimestamp="2026-03-07 06:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:16.815322045 +0000 UTC m=+153.767687510" watchObservedRunningTime="2026-03-07 06:54:16.81622112 +0000 UTC m=+153.768586585" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.827513 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podStartSLOduration=119.827471036 podStartE2EDuration="1m59.827471036s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:16.826558111 +0000 UTC m=+153.778923566" watchObservedRunningTime="2026-03-07 06:54:16.827471036 +0000 UTC m=+153.779836501" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.852373 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fff607f-b0f7-4c60-b202-5948a682e3d6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.852484 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1fff607f-b0f7-4c60-b202-5948a682e3d6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.852550 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fff607f-b0f7-4c60-b202-5948a682e3d6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.852577 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fff607f-b0f7-4c60-b202-5948a682e3d6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.852625 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fff607f-b0f7-4c60-b202-5948a682e3d6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.852646 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fff607f-b0f7-4c60-b202-5948a682e3d6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc 
kubenswrapper[4941]: I0307 06:54:16.852675 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fff607f-b0f7-4c60-b202-5948a682e3d6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.853796 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fff607f-b0f7-4c60-b202-5948a682e3d6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.859916 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fff607f-b0f7-4c60-b202-5948a682e3d6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.868509 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fff607f-b0f7-4c60-b202-5948a682e3d6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gc7m7\" (UID: \"1fff607f-b0f7-4c60-b202-5948a682e3d6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.953999 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:16 crc kubenswrapper[4941]: E0307 06:54:16.954163 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.986312 4941 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 07 06:54:16 crc kubenswrapper[4941]: I0307 06:54:16.996618 4941 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 07 06:54:17 crc kubenswrapper[4941]: I0307 06:54:17.034794 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" Mar 07 06:54:17 crc kubenswrapper[4941]: I0307 06:54:17.841963 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" event={"ID":"1fff607f-b0f7-4c60-b202-5948a682e3d6","Type":"ContainerStarted","Data":"b514f27a84a031ee12cc45e0dfb9f9c854beaee38f62f2bdb132d698020005c6"} Mar 07 06:54:17 crc kubenswrapper[4941]: I0307 06:54:17.842063 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" event={"ID":"1fff607f-b0f7-4c60-b202-5948a682e3d6","Type":"ContainerStarted","Data":"2134cd4764b9ee7ac47c0946aa052b7dd1a404277eb75f79c7487f58ca31f5a9"} Mar 07 06:54:17 crc kubenswrapper[4941]: I0307 06:54:17.870317 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gc7m7" podStartSLOduration=120.870281515 podStartE2EDuration="2m0.870281515s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:17.868016892 +0000 UTC m=+154.820382407" watchObservedRunningTime="2026-03-07 06:54:17.870281515 +0000 UTC m=+154.822647021" Mar 07 06:54:17 crc kubenswrapper[4941]: I0307 06:54:17.953822 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:17 crc kubenswrapper[4941]: E0307 06:54:17.953973 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:17 crc kubenswrapper[4941]: I0307 06:54:17.954156 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:17 crc kubenswrapper[4941]: E0307 06:54:17.954227 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:17 crc kubenswrapper[4941]: I0307 06:54:17.954466 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:17 crc kubenswrapper[4941]: E0307 06:54:17.954619 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:18 crc kubenswrapper[4941]: I0307 06:54:18.954050 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:18 crc kubenswrapper[4941]: E0307 06:54:18.954304 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:19 crc kubenswrapper[4941]: E0307 06:54:19.083130 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:19 crc kubenswrapper[4941]: I0307 06:54:19.954220 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:19 crc kubenswrapper[4941]: E0307 06:54:19.954435 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:19 crc kubenswrapper[4941]: I0307 06:54:19.954220 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:19 crc kubenswrapper[4941]: I0307 06:54:19.954726 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:19 crc kubenswrapper[4941]: E0307 06:54:19.954829 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:19 crc kubenswrapper[4941]: E0307 06:54:19.955010 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:20 crc kubenswrapper[4941]: I0307 06:54:20.954162 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:20 crc kubenswrapper[4941]: E0307 06:54:20.954542 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:20 crc kubenswrapper[4941]: I0307 06:54:20.954815 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 06:54:20 crc kubenswrapper[4941]: E0307 06:54:20.954966 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:54:21 crc kubenswrapper[4941]: I0307 06:54:21.954429 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:21 crc kubenswrapper[4941]: E0307 06:54:21.954604 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:21 crc kubenswrapper[4941]: I0307 06:54:21.954457 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:21 crc kubenswrapper[4941]: E0307 06:54:21.954693 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:21 crc kubenswrapper[4941]: I0307 06:54:21.954429 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:21 crc kubenswrapper[4941]: E0307 06:54:21.956387 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:22 crc kubenswrapper[4941]: I0307 06:54:22.954182 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:22 crc kubenswrapper[4941]: E0307 06:54:22.954347 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:23 crc kubenswrapper[4941]: I0307 06:54:23.953685 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:23 crc kubenswrapper[4941]: I0307 06:54:23.953737 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:23 crc kubenswrapper[4941]: I0307 06:54:23.956456 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:23 crc kubenswrapper[4941]: E0307 06:54:23.956435 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:23 crc kubenswrapper[4941]: E0307 06:54:23.956628 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:23 crc kubenswrapper[4941]: E0307 06:54:23.956911 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:24 crc kubenswrapper[4941]: E0307 06:54:24.083738 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:24 crc kubenswrapper[4941]: I0307 06:54:24.953727 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:24 crc kubenswrapper[4941]: E0307 06:54:24.954126 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:25 crc kubenswrapper[4941]: I0307 06:54:25.953485 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:25 crc kubenswrapper[4941]: I0307 06:54:25.953618 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:25 crc kubenswrapper[4941]: E0307 06:54:25.953649 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:25 crc kubenswrapper[4941]: I0307 06:54:25.953703 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:25 crc kubenswrapper[4941]: E0307 06:54:25.953824 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:25 crc kubenswrapper[4941]: E0307 06:54:25.953946 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:26 crc kubenswrapper[4941]: I0307 06:54:26.953927 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:26 crc kubenswrapper[4941]: E0307 06:54:26.954162 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:27 crc kubenswrapper[4941]: I0307 06:54:27.954198 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:27 crc kubenswrapper[4941]: I0307 06:54:27.954285 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:27 crc kubenswrapper[4941]: E0307 06:54:27.954630 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:27 crc kubenswrapper[4941]: I0307 06:54:27.954672 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:27 crc kubenswrapper[4941]: E0307 06:54:27.954816 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:27 crc kubenswrapper[4941]: E0307 06:54:27.954937 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:28 crc kubenswrapper[4941]: I0307 06:54:28.954128 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:28 crc kubenswrapper[4941]: E0307 06:54:28.954572 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:29 crc kubenswrapper[4941]: E0307 06:54:29.085481 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:29 crc kubenswrapper[4941]: I0307 06:54:29.953845 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:29 crc kubenswrapper[4941]: I0307 06:54:29.953914 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:29 crc kubenswrapper[4941]: I0307 06:54:29.953922 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:29 crc kubenswrapper[4941]: E0307 06:54:29.954109 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:29 crc kubenswrapper[4941]: E0307 06:54:29.954208 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:29 crc kubenswrapper[4941]: E0307 06:54:29.954269 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:30 crc kubenswrapper[4941]: I0307 06:54:30.954388 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:30 crc kubenswrapper[4941]: E0307 06:54:30.954617 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:31 crc kubenswrapper[4941]: I0307 06:54:31.953969 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:31 crc kubenswrapper[4941]: I0307 06:54:31.954047 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:31 crc kubenswrapper[4941]: E0307 06:54:31.954142 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:31 crc kubenswrapper[4941]: E0307 06:54:31.954280 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:31 crc kubenswrapper[4941]: I0307 06:54:31.954289 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:31 crc kubenswrapper[4941]: E0307 06:54:31.954572 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:32 crc kubenswrapper[4941]: I0307 06:54:32.954107 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:32 crc kubenswrapper[4941]: E0307 06:54:32.954306 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:33 crc kubenswrapper[4941]: I0307 06:54:33.954699 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:33 crc kubenswrapper[4941]: I0307 06:54:33.956928 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:33 crc kubenswrapper[4941]: I0307 06:54:33.957121 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:33 crc kubenswrapper[4941]: I0307 06:54:33.957791 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 06:54:33 crc kubenswrapper[4941]: E0307 06:54:33.957777 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:33 crc kubenswrapper[4941]: E0307 06:54:33.957939 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:33 crc kubenswrapper[4941]: E0307 06:54:33.957983 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:54:33 crc kubenswrapper[4941]: E0307 06:54:33.958030 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:34 crc kubenswrapper[4941]: E0307 06:54:34.086160 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:34 crc kubenswrapper[4941]: I0307 06:54:34.954449 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:34 crc kubenswrapper[4941]: E0307 06:54:34.954667 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:35 crc kubenswrapper[4941]: I0307 06:54:35.953894 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:35 crc kubenswrapper[4941]: I0307 06:54:35.953981 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:35 crc kubenswrapper[4941]: E0307 06:54:35.954024 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:35 crc kubenswrapper[4941]: E0307 06:54:35.954119 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:35 crc kubenswrapper[4941]: I0307 06:54:35.954262 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:35 crc kubenswrapper[4941]: E0307 06:54:35.954316 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:36 crc kubenswrapper[4941]: I0307 06:54:36.953575 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:36 crc kubenswrapper[4941]: E0307 06:54:36.953895 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:37 crc kubenswrapper[4941]: I0307 06:54:37.954493 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:37 crc kubenswrapper[4941]: I0307 06:54:37.954528 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:37 crc kubenswrapper[4941]: I0307 06:54:37.954738 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:37 crc kubenswrapper[4941]: E0307 06:54:37.954734 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:37 crc kubenswrapper[4941]: E0307 06:54:37.954943 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:37 crc kubenswrapper[4941]: E0307 06:54:37.955096 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:38 crc kubenswrapper[4941]: I0307 06:54:38.954268 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:38 crc kubenswrapper[4941]: E0307 06:54:38.954601 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:39 crc kubenswrapper[4941]: E0307 06:54:39.087362 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:39 crc kubenswrapper[4941]: I0307 06:54:39.953685 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:39 crc kubenswrapper[4941]: E0307 06:54:39.953899 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:39 crc kubenswrapper[4941]: I0307 06:54:39.954251 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:39 crc kubenswrapper[4941]: I0307 06:54:39.954321 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:39 crc kubenswrapper[4941]: E0307 06:54:39.954431 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:39 crc kubenswrapper[4941]: E0307 06:54:39.954827 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:40 crc kubenswrapper[4941]: I0307 06:54:40.953579 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:40 crc kubenswrapper[4941]: E0307 06:54:40.953801 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:41 crc kubenswrapper[4941]: I0307 06:54:41.954018 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:41 crc kubenswrapper[4941]: I0307 06:54:41.954184 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:41 crc kubenswrapper[4941]: E0307 06:54:41.954312 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:41 crc kubenswrapper[4941]: I0307 06:54:41.954355 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:41 crc kubenswrapper[4941]: E0307 06:54:41.954453 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:41 crc kubenswrapper[4941]: E0307 06:54:41.954516 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:42 crc kubenswrapper[4941]: I0307 06:54:42.954327 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:42 crc kubenswrapper[4941]: E0307 06:54:42.954539 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:43 crc kubenswrapper[4941]: I0307 06:54:43.935636 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/1.log" Mar 07 06:54:43 crc kubenswrapper[4941]: I0307 06:54:43.936510 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/0.log" Mar 07 06:54:43 crc kubenswrapper[4941]: I0307 06:54:43.936688 4941 generic.go:334] "Generic (PLEG): container finished" podID="ed82bc0c-1609-449c-b2e2-2fe04af9749d" containerID="9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261" exitCode=1 Mar 07 06:54:43 crc kubenswrapper[4941]: I0307 06:54:43.936752 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kc9rw" event={"ID":"ed82bc0c-1609-449c-b2e2-2fe04af9749d","Type":"ContainerDied","Data":"9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261"} Mar 07 06:54:43 crc kubenswrapper[4941]: I0307 06:54:43.936834 4941 scope.go:117] "RemoveContainer" containerID="9cfc34f6710ac06bdeb2089997f846292163dd5d75e4f206baa93e78fce9e1a7" Mar 07 06:54:43 crc 
kubenswrapper[4941]: I0307 06:54:43.937242 4941 scope.go:117] "RemoveContainer" containerID="9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261" Mar 07 06:54:43 crc kubenswrapper[4941]: E0307 06:54:43.937536 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-kc9rw_openshift-multus(ed82bc0c-1609-449c-b2e2-2fe04af9749d)\"" pod="openshift-multus/multus-kc9rw" podUID="ed82bc0c-1609-449c-b2e2-2fe04af9749d" Mar 07 06:54:43 crc kubenswrapper[4941]: I0307 06:54:43.954187 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:43 crc kubenswrapper[4941]: I0307 06:54:43.954187 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:43 crc kubenswrapper[4941]: E0307 06:54:43.954782 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:43 crc kubenswrapper[4941]: E0307 06:54:43.955469 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:43 crc kubenswrapper[4941]: I0307 06:54:43.956018 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:43 crc kubenswrapper[4941]: E0307 06:54:43.956111 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:44 crc kubenswrapper[4941]: E0307 06:54:44.087950 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:44 crc kubenswrapper[4941]: I0307 06:54:44.942538 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/1.log" Mar 07 06:54:44 crc kubenswrapper[4941]: I0307 06:54:44.953728 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:44 crc kubenswrapper[4941]: E0307 06:54:44.953909 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:45 crc kubenswrapper[4941]: I0307 06:54:45.953923 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:45 crc kubenswrapper[4941]: E0307 06:54:45.954034 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:45 crc kubenswrapper[4941]: I0307 06:54:45.953923 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:45 crc kubenswrapper[4941]: E0307 06:54:45.954394 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:45 crc kubenswrapper[4941]: I0307 06:54:45.954737 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 06:54:45 crc kubenswrapper[4941]: E0307 06:54:45.954895 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5ztp_openshift-ovn-kubernetes(c3469f59-621c-4493-ade3-768772d05ebd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" Mar 07 06:54:45 crc kubenswrapper[4941]: I0307 06:54:45.957923 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:45 crc kubenswrapper[4941]: E0307 06:54:45.958014 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:46 crc kubenswrapper[4941]: I0307 06:54:46.953785 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:46 crc kubenswrapper[4941]: E0307 06:54:46.953947 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:47 crc kubenswrapper[4941]: I0307 06:54:47.953700 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:47 crc kubenswrapper[4941]: I0307 06:54:47.953764 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:47 crc kubenswrapper[4941]: E0307 06:54:47.953832 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:47 crc kubenswrapper[4941]: I0307 06:54:47.953785 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:47 crc kubenswrapper[4941]: E0307 06:54:47.953980 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:47 crc kubenswrapper[4941]: E0307 06:54:47.954041 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:48 crc kubenswrapper[4941]: I0307 06:54:48.953896 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:48 crc kubenswrapper[4941]: E0307 06:54:48.954097 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:49 crc kubenswrapper[4941]: E0307 06:54:49.089342 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:49 crc kubenswrapper[4941]: I0307 06:54:49.954374 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:49 crc kubenswrapper[4941]: E0307 06:54:49.954530 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:49 crc kubenswrapper[4941]: I0307 06:54:49.954581 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:49 crc kubenswrapper[4941]: E0307 06:54:49.954623 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:49 crc kubenswrapper[4941]: I0307 06:54:49.954621 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:49 crc kubenswrapper[4941]: E0307 06:54:49.954872 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:50 crc kubenswrapper[4941]: I0307 06:54:50.953839 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:50 crc kubenswrapper[4941]: E0307 06:54:50.954713 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:51 crc kubenswrapper[4941]: I0307 06:54:51.954134 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:51 crc kubenswrapper[4941]: I0307 06:54:51.954183 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:51 crc kubenswrapper[4941]: E0307 06:54:51.954333 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:51 crc kubenswrapper[4941]: I0307 06:54:51.954435 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:51 crc kubenswrapper[4941]: E0307 06:54:51.954574 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:51 crc kubenswrapper[4941]: E0307 06:54:51.954691 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:52 crc kubenswrapper[4941]: I0307 06:54:52.954097 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:52 crc kubenswrapper[4941]: E0307 06:54:52.954296 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:53 crc kubenswrapper[4941]: I0307 06:54:53.953736 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:53 crc kubenswrapper[4941]: I0307 06:54:53.953775 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:53 crc kubenswrapper[4941]: I0307 06:54:53.954724 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:53 crc kubenswrapper[4941]: E0307 06:54:53.954754 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:53 crc kubenswrapper[4941]: E0307 06:54:53.954831 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:53 crc kubenswrapper[4941]: E0307 06:54:53.954949 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:54 crc kubenswrapper[4941]: E0307 06:54:54.089973 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:54 crc kubenswrapper[4941]: I0307 06:54:54.954148 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:54 crc kubenswrapper[4941]: E0307 06:54:54.954354 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:55 crc kubenswrapper[4941]: I0307 06:54:55.954085 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:55 crc kubenswrapper[4941]: I0307 06:54:55.954085 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:55 crc kubenswrapper[4941]: E0307 06:54:55.954223 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:55 crc kubenswrapper[4941]: E0307 06:54:55.954382 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:55 crc kubenswrapper[4941]: I0307 06:54:55.954443 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:55 crc kubenswrapper[4941]: E0307 06:54:55.954513 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:56 crc kubenswrapper[4941]: I0307 06:54:56.954582 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:56 crc kubenswrapper[4941]: E0307 06:54:56.954786 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:57 crc kubenswrapper[4941]: I0307 06:54:57.954354 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:57 crc kubenswrapper[4941]: E0307 06:54:57.955098 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:57 crc kubenswrapper[4941]: I0307 06:54:57.955313 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:57 crc kubenswrapper[4941]: I0307 06:54:57.955128 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:57 crc kubenswrapper[4941]: I0307 06:54:57.956102 4941 scope.go:117] "RemoveContainer" containerID="9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261" Mar 07 06:54:57 crc kubenswrapper[4941]: E0307 06:54:57.955990 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:57 crc kubenswrapper[4941]: I0307 06:54:57.956263 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 06:54:57 crc kubenswrapper[4941]: E0307 06:54:57.956492 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:54:58 crc kubenswrapper[4941]: I0307 06:54:58.838487 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q9fpr"] Mar 07 06:54:58 crc kubenswrapper[4941]: I0307 06:54:58.839191 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:54:58 crc kubenswrapper[4941]: E0307 06:54:58.839355 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:54:58 crc kubenswrapper[4941]: I0307 06:54:58.954513 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:54:58 crc kubenswrapper[4941]: E0307 06:54:58.954790 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:54:58 crc kubenswrapper[4941]: I0307 06:54:58.989497 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/3.log" Mar 07 06:54:58 crc kubenswrapper[4941]: I0307 06:54:58.992064 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerStarted","Data":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} Mar 07 06:54:58 crc kubenswrapper[4941]: I0307 06:54:58.992752 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:54:58 crc kubenswrapper[4941]: I0307 06:54:58.993575 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/1.log" Mar 07 06:54:58 crc kubenswrapper[4941]: I0307 06:54:58.993623 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kc9rw" event={"ID":"ed82bc0c-1609-449c-b2e2-2fe04af9749d","Type":"ContainerStarted","Data":"308a567803d153cdae67d929aaf40daee703177d7713986faf0d6ac3e4e79eb6"} Mar 07 06:54:59 crc kubenswrapper[4941]: I0307 06:54:59.025065 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podStartSLOduration=162.025041092 podStartE2EDuration="2m42.025041092s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:54:59.025016272 +0000 UTC m=+195.977381737" watchObservedRunningTime="2026-03-07 06:54:59.025041092 +0000 UTC m=+195.977406557" Mar 07 06:54:59 crc kubenswrapper[4941]: E0307 06:54:59.092305 4941 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 06:54:59 crc kubenswrapper[4941]: I0307 06:54:59.954189 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:54:59 crc kubenswrapper[4941]: E0307 06:54:59.954366 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:54:59 crc kubenswrapper[4941]: I0307 06:54:59.954487 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:54:59 crc kubenswrapper[4941]: E0307 06:54:59.954732 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:55:00 crc kubenswrapper[4941]: I0307 06:55:00.954522 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:55:00 crc kubenswrapper[4941]: I0307 06:55:00.954581 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:55:00 crc kubenswrapper[4941]: E0307 06:55:00.954674 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:55:00 crc kubenswrapper[4941]: E0307 06:55:00.954825 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:55:01 crc kubenswrapper[4941]: I0307 06:55:01.953955 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:55:01 crc kubenswrapper[4941]: I0307 06:55:01.953991 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:55:01 crc kubenswrapper[4941]: E0307 06:55:01.954191 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:55:01 crc kubenswrapper[4941]: E0307 06:55:01.954262 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:55:02 crc kubenswrapper[4941]: I0307 06:55:02.954451 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:55:02 crc kubenswrapper[4941]: I0307 06:55:02.954476 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:55:02 crc kubenswrapper[4941]: E0307 06:55:02.954624 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 06:55:02 crc kubenswrapper[4941]: E0307 06:55:02.954733 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q9fpr" podUID="80030e60-caa3-4aad-8b00-10f5143d9243" Mar 07 06:55:03 crc kubenswrapper[4941]: I0307 06:55:03.955323 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:55:03 crc kubenswrapper[4941]: E0307 06:55:03.955441 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 06:55:03 crc kubenswrapper[4941]: I0307 06:55:03.955621 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:55:03 crc kubenswrapper[4941]: E0307 06:55:03.955673 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 06:55:04 crc kubenswrapper[4941]: I0307 06:55:04.954490 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:55:04 crc kubenswrapper[4941]: I0307 06:55:04.954506 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:55:04 crc kubenswrapper[4941]: I0307 06:55:04.956660 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 07 06:55:04 crc kubenswrapper[4941]: I0307 06:55:04.956835 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 07 06:55:04 crc kubenswrapper[4941]: I0307 06:55:04.957868 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 07 06:55:04 crc kubenswrapper[4941]: I0307 06:55:04.957914 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 07 06:55:05 crc kubenswrapper[4941]: I0307 06:55:05.954488 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:55:05 crc kubenswrapper[4941]: I0307 06:55:05.954513 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:55:05 crc kubenswrapper[4941]: I0307 06:55:05.957604 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 06:55:05 crc kubenswrapper[4941]: I0307 06:55:05.959007 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.668150 4941 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.756790 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j2bnz"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.758045 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.763751 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.765468 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.766222 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.767452 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.767948 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hvmvs"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.775668 4941 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"audit-1" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.775873 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v76pm"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.776134 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.776339 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.776464 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.776336 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.777080 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.778451 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5bq7n"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.778974 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.781501 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.788968 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.789193 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.789390 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.791089 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8n7vr"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.791571 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.793708 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.794554 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.798346 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.798848 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.804752 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hr792"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.805386 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.805888 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x2rhs"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.806696 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.818551 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nwzjs"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.819163 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.821218 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ljm6r"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.821751 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.823718 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrc6"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.824113 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.824501 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.824667 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.834616 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6959d"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.835766 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.836492 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.837009 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.837068 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.837663 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.844509 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.845135 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.849048 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.849583 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.850110 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.850658 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5tr44"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.850942 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.851077 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.851571 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.852248 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.853697 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wpb7c"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.854699 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.855595 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.855613 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.886480 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cb2gw"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.888051 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.894885 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.896570 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.914953 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.913597 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4jh\" (UniqueName: \"kubernetes.io/projected/bb8c0212-2a6d-4636-a75b-08a350f5948f-kube-api-access-cm4jh\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.931804 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-service-ca\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.931840 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-config\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 
06:55:07.931870 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-audit-policies\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.931902 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fw49\" (UniqueName: \"kubernetes.io/projected/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-kube-api-access-6fw49\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.931927 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.931951 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r28vr\" (UniqueName: \"kubernetes.io/projected/612ac789-5007-4e17-a81a-cf753c2acadc-kube-api-access-r28vr\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.931970 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-config\") pod 
\"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.931994 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vstnp\" (UniqueName: \"kubernetes.io/projected/46da50cb-1038-4289-be6d-e5f3b4c70ab3-kube-api-access-vstnp\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932016 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-config\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932037 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932059 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8c0212-2a6d-4636-a75b-08a350f5948f-serving-cert\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932086 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9beea563-7739-4b17-b360-bb769400bdff-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ccz2v\" (UID: \"9beea563-7739-4b17-b360-bb769400bdff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932109 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rs9x\" (UniqueName: \"kubernetes.io/projected/f2470875-024a-4ef0-9e01-20bbbfff60bc-kube-api-access-8rs9x\") pod \"dns-operator-744455d44c-x2rhs\" (UID: \"f2470875-024a-4ef0-9e01-20bbbfff60bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932170 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-service-ca-bundle\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932214 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shw46\" (UniqueName: \"kubernetes.io/projected/b14f5741-27aa-4ff2-a2dc-69c385a07e16-kube-api-access-shw46\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932239 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v76pm\" (UID: 
\"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932272 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932297 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2470875-024a-4ef0-9e01-20bbbfff60bc-metrics-tls\") pod \"dns-operator-744455d44c-x2rhs\" (UID: \"f2470875-024a-4ef0-9e01-20bbbfff60bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932330 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-serving-cert\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932355 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/48b6255a-3390-4e84-bed2-6a28fc0c9800-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932379 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wb5b\" (UniqueName: \"kubernetes.io/projected/44081086-ee5b-4b26-8af9-f35aa03402fc-kube-api-access-4wb5b\") pod \"migrator-59844c95c7-ntlhm\" (UID: \"44081086-ee5b-4b26-8af9-f35aa03402fc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932428 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932452 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932479 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932542 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-serving-cert\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932594 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-oauth-config\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932620 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4nd\" (UniqueName: \"kubernetes.io/projected/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-kube-api-access-4x4nd\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932641 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-etcd-client\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932668 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b14f5741-27aa-4ff2-a2dc-69c385a07e16-auth-proxy-config\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932695 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-audit-dir\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932717 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14f5741-27aa-4ff2-a2dc-69c385a07e16-config\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932739 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-client-ca\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932762 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-oauth-serving-cert\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932801 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-config\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932823 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-trusted-ca-bundle\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932845 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932874 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-etcd-serving-ca\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932895 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-encryption-config\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932938 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932961 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b14f5741-27aa-4ff2-a2dc-69c385a07e16-machine-approver-tls\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.932984 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-node-pullsecrets\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933008 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-image-import-ca\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933034 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/612ac789-5007-4e17-a81a-cf753c2acadc-audit-dir\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc 
kubenswrapper[4941]: I0307 06:55:07.933057 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25380ea-ea92-42bb-bd73-4da399dc0cc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933082 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llhvz\" (UniqueName: \"kubernetes.io/projected/e25380ea-ea92-42bb-bd73-4da399dc0cc4-kube-api-access-llhvz\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933106 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c60c91-094d-4c52-9dcb-36ad07c829ad-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933130 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-serving-cert\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933152 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plk2b\" (UniqueName: 
\"kubernetes.io/projected/48b6255a-3390-4e84-bed2-6a28fc0c9800-kube-api-access-plk2b\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933193 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-client\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933217 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7c60c91-094d-4c52-9dcb-36ad07c829ad-images\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933242 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-serving-cert\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933268 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933294 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933316 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933357 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-ca\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933379 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzrp4\" (UniqueName: \"kubernetes.io/projected/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-kube-api-access-mzrp4\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933746 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rg9j4\" (UniqueName: \"kubernetes.io/projected/a7c60c91-094d-4c52-9dcb-36ad07c829ad-kube-api-access-rg9j4\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933793 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b6255a-3390-4e84-bed2-6a28fc0c9800-serving-cert\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933816 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5gwt\" (UniqueName: \"kubernetes.io/projected/9beea563-7739-4b17-b360-bb769400bdff-kube-api-access-r5gwt\") pod \"cluster-samples-operator-665b6dd947-ccz2v\" (UID: \"9beea563-7739-4b17-b360-bb769400bdff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933844 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-config\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933867 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x54rk\" (UniqueName: \"kubernetes.io/projected/3f291eed-0e60-43a7-a34a-2f7ad9788126-kube-api-access-x54rk\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: 
\"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933889 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933910 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-audit\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933932 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25380ea-ea92-42bb-bd73-4da399dc0cc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933953 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-config\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933975 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.933993 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-trusted-ca\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.934011 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f291eed-0e60-43a7-a34a-2f7ad9788126-serving-cert\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.934034 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.934054 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c60c91-094d-4c52-9dcb-36ad07c829ad-config\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.934244 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.934528 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.934785 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.935621 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.936156 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.936434 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.936812 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.936930 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.937093 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.937203 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.938026 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-w745p"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.938686 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.941794 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8tds9"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.942441 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.947112 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-29stc"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.947678 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.948231 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.948290 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.948672 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.949180 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.950419 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.950606 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.956822 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.956996 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.957160 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.957244 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.957345 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.957450 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.957627 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.957766 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 
06:55:07.958088 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.958131 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.958868 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.959210 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.960138 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.961290 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.961590 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.961722 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.961815 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.961827 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.961969 4941 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962067 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962122 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962172 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962327 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962370 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962461 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962520 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962548 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962684 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962794 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962818 4941 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.962928 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963068 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963097 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963203 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963243 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963310 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963374 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963515 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963553 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963743 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 06:55:07 crc 
kubenswrapper[4941]: I0307 06:55:07.963789 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963883 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963943 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.963897 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964136 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964147 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964252 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964317 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964383 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964444 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964551 
4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964561 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964669 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964722 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964761 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964674 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964838 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964910 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964963 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.964997 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.972485 4941 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7x6zc"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.973092 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.973433 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.973642 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.973759 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547774-2b2fh"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.973912 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.973995 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.974083 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.974274 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.974484 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.974576 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.975033 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.975617 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.975850 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.977242 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.977576 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.977937 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.978055 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.978269 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.983036 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.983394 4941 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.983539 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.983603 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.983812 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.984052 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.986261 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.987005 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.987976 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2"] Mar 07 06:55:07 crc kubenswrapper[4941]: I0307 06:55:07.994963 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.002073 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.004566 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 07 
06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.028222 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.029781 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.029856 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.032220 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.033042 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035556 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-trusted-ca-bundle\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035601 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035626 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-encryption-config\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035644 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-etcd-serving-ca\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035664 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035683 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b14f5741-27aa-4ff2-a2dc-69c385a07e16-machine-approver-tls\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035697 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-node-pullsecrets\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035712 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-image-import-ca\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035762 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/612ac789-5007-4e17-a81a-cf753c2acadc-audit-dir\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035757 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035940 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq"] Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.035782 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25380ea-ea92-42bb-bd73-4da399dc0cc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036433 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llhvz\" (UniqueName: \"kubernetes.io/projected/e25380ea-ea92-42bb-bd73-4da399dc0cc4-kube-api-access-llhvz\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:08 
crc kubenswrapper[4941]: I0307 06:55:08.036480 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/755ab720-3d7a-4d86-8613-147ac07a93dd-config\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036507 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/755ab720-3d7a-4d86-8613-147ac07a93dd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036534 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37fe4f8d-b1a8-4848-805b-f44095a2daeb-webhook-cert\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036562 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c60c91-094d-4c52-9dcb-36ad07c829ad-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036590 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-serving-cert\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036615 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plk2b\" (UniqueName: \"kubernetes.io/projected/48b6255a-3390-4e84-bed2-6a28fc0c9800-kube-api-access-plk2b\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036642 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/451b0ac4-065b-46e3-813e-7b62a311e7eb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036688 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-client\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036712 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7c60c91-094d-4c52-9dcb-36ad07c829ad-images\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 
06:55:08.036718 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j2bnz"] Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036737 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hvmvs"] Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036752 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5bq7n"] Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036763 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn"] Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.037191 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.037541 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.037650 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25380ea-ea92-42bb-bd73-4da399dc0cc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.037722 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.036735 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m24s\" (UniqueName: \"kubernetes.io/projected/451b0ac4-065b-46e3-813e-7b62a311e7eb-kube-api-access-6m24s\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.037927 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a004c43d-7acc-4a7e-afc1-947c31df55ad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8tfd\" (UID: \"a004c43d-7acc-4a7e-afc1-947c31df55ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.037957 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-serving-cert\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.037978 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: 
I0307 06:55:08.038019 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038041 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038062 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-ca\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038082 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzrp4\" (UniqueName: \"kubernetes.io/projected/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-kube-api-access-mzrp4\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038112 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg9j4\" (UniqueName: \"kubernetes.io/projected/a7c60c91-094d-4c52-9dcb-36ad07c829ad-kube-api-access-rg9j4\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: 
\"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038131 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b6255a-3390-4e84-bed2-6a28fc0c9800-serving-cert\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038155 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-config\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038173 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x54rk\" (UniqueName: \"kubernetes.io/projected/3f291eed-0e60-43a7-a34a-2f7ad9788126-kube-api-access-x54rk\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038190 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5gwt\" (UniqueName: \"kubernetes.io/projected/9beea563-7739-4b17-b360-bb769400bdff-kube-api-access-r5gwt\") pod \"cluster-samples-operator-665b6dd947-ccz2v\" (UID: \"9beea563-7739-4b17-b360-bb769400bdff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038220 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038235 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-audit\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038254 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25380ea-ea92-42bb-bd73-4da399dc0cc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038275 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37fe4f8d-b1a8-4848-805b-f44095a2daeb-apiservice-cert\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038295 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-config\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038316 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038341 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-trusted-ca\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038365 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f291eed-0e60-43a7-a34a-2f7ad9788126-serving-cert\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038382 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/755ab720-3d7a-4d86-8613-147ac07a93dd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038421 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8n7vr\" 
(UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038441 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c60c91-094d-4c52-9dcb-36ad07c829ad-config\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038459 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-service-ca\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038474 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-config\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038491 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-audit-policies\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038509 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fw49\" (UniqueName: \"kubernetes.io/projected/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-kube-api-access-6fw49\") pod \"apiserver-76f77b778f-j2bnz\" 
(UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038526 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4jh\" (UniqueName: \"kubernetes.io/projected/bb8c0212-2a6d-4636-a75b-08a350f5948f-kube-api-access-cm4jh\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038542 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038561 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r28vr\" (UniqueName: \"kubernetes.io/projected/612ac789-5007-4e17-a81a-cf753c2acadc-kube-api-access-r28vr\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038578 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-config\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038602 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vstnp\" (UniqueName: 
\"kubernetes.io/projected/46da50cb-1038-4289-be6d-e5f3b4c70ab3-kube-api-access-vstnp\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038621 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-config\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038636 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038654 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8c0212-2a6d-4636-a75b-08a350f5948f-serving-cert\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038686 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9beea563-7739-4b17-b360-bb769400bdff-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ccz2v\" (UID: \"9beea563-7739-4b17-b360-bb769400bdff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038711 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4xvf\" (UniqueName: \"kubernetes.io/projected/a004c43d-7acc-4a7e-afc1-947c31df55ad-kube-api-access-q4xvf\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8tfd\" (UID: \"a004c43d-7acc-4a7e-afc1-947c31df55ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038745 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-service-ca-bundle\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038771 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rs9x\" (UniqueName: \"kubernetes.io/projected/f2470875-024a-4ef0-9e01-20bbbfff60bc-kube-api-access-8rs9x\") pod \"dns-operator-744455d44c-x2rhs\" (UID: \"f2470875-024a-4ef0-9e01-20bbbfff60bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038795 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shw46\" (UniqueName: \"kubernetes.io/projected/b14f5741-27aa-4ff2-a2dc-69c385a07e16-kube-api-access-shw46\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038840 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v76pm\" (UID: 
\"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038860 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038883 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2470875-024a-4ef0-9e01-20bbbfff60bc-metrics-tls\") pod \"dns-operator-744455d44c-x2rhs\" (UID: \"f2470875-024a-4ef0-9e01-20bbbfff60bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038902 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-serving-cert\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038917 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/48b6255a-3390-4e84-bed2-6a28fc0c9800-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038935 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038951 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038969 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038988 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-serving-cert\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039004 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wb5b\" (UniqueName: \"kubernetes.io/projected/44081086-ee5b-4b26-8af9-f35aa03402fc-kube-api-access-4wb5b\") pod \"migrator-59844c95c7-ntlhm\" (UID: \"44081086-ee5b-4b26-8af9-f35aa03402fc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" Mar 07 06:55:08 crc kubenswrapper[4941]: 
I0307 06:55:08.039021 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/451b0ac4-065b-46e3-813e-7b62a311e7eb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039043 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k4jd\" (UniqueName: \"kubernetes.io/projected/37fe4f8d-b1a8-4848-805b-f44095a2daeb-kube-api-access-4k4jd\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039073 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-oauth-config\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039091 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4nd\" (UniqueName: \"kubernetes.io/projected/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-kube-api-access-4x4nd\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039107 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/451b0ac4-065b-46e3-813e-7b62a311e7eb-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039131 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b14f5741-27aa-4ff2-a2dc-69c385a07e16-auth-proxy-config\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039150 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-etcd-client\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039166 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-audit-dir\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039183 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/37fe4f8d-b1a8-4848-805b-f44095a2daeb-tmpfs\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039202 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b14f5741-27aa-4ff2-a2dc-69c385a07e16-config\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039217 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-client-ca\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039234 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-oauth-serving-cert\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039250 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-config\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039581 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-image-import-ca\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039658 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-node-pullsecrets\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.039799 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-etcd-serving-ca\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.040371 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-audit-policies\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.041295 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-config\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.041917 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b14f5741-27aa-4ff2-a2dc-69c385a07e16-auth-proxy-config\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.044330 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b14f5741-27aa-4ff2-a2dc-69c385a07e16-machine-approver-tls\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.044384 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c60c91-094d-4c52-9dcb-36ad07c829ad-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.044451 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/612ac789-5007-4e17-a81a-cf753c2acadc-audit-dir\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.044384 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.044960 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/48b6255a-3390-4e84-bed2-6a28fc0c9800-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.045263 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7c60c91-094d-4c52-9dcb-36ad07c829ad-images\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.045509 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-etcd-client\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.045596 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-audit-dir\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.046308 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.046369 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14f5741-27aa-4ff2-a2dc-69c385a07e16-config\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.047010 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-serving-cert\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.047755 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-audit\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.048458 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-client-ca\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.048673 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-encryption-config\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.052614 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-oauth-serving-cert\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.052674 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-serving-cert\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.053743 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.054373 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-oauth-config\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.054569 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-client\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.055084 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.054871 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-serving-cert\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.055626 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-etcd-ca\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.038071 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-trusted-ca-bundle\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.055914 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-config\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.056301 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e25380ea-ea92-42bb-bd73-4da399dc0cc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.056487 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-config\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.057467 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.058104 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-service-ca-bundle\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.058117 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-config\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.058318 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-trusted-ca\") pod \"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.058462 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c60c91-094d-4c52-9dcb-36ad07c829ad-config\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.059006 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f291eed-0e60-43a7-a34a-2f7ad9788126-config\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.059293 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.059336 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-service-ca\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.059382 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.059728 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.060012 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-config\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.060182 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.060629 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.060686 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.062043 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8c0212-2a6d-4636-a75b-08a350f5948f-serving-cert\") pod \"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.062591 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.062901 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8lsvm"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.063985 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8lsvm"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.064985 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v76pm"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.066180 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.066297 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.066372 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.066514 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x2rhs"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.067020 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-serving-cert\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.068497 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f291eed-0e60-43a7-a34a-2f7ad9788126-serving-cert\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.068587 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8n7vr"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.068730 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9beea563-7739-4b17-b360-bb769400bdff-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ccz2v\" (UID: \"9beea563-7739-4b17-b360-bb769400bdff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.069114 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2470875-024a-4ef0-9e01-20bbbfff60bc-metrics-tls\") pod \"dns-operator-744455d44c-x2rhs\" (UID: \"f2470875-024a-4ef0-9e01-20bbbfff60bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.069511 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.070328 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.073475 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b6255a-3390-4e84-bed2-6a28fc0c9800-serving-cert\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.074298 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrc6"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.074827 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.075012 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.075754 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ljm6r"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.076254 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.076873 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.080068 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.087168 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cb2gw"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.088582 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.090742 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.092015 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.095187 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.095240 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8tds9"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.097523 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6959d"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.102450 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.108050 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.112608 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5tr44"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.114270 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.115675 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.117304 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.118008 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.119361 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.120561 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.120860 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nwzjs"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.122100 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.123239 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-skhtt"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.124368 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-skhtt"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.124834 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8lsvm"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.126027 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.127155 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.129327 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wpb7c"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.130561 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-29stc"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.132146 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.133282 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7x6zc"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.134704 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.135889 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547774-2b2fh"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.137049 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.138212 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.139513 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140062 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/451b0ac4-065b-46e3-813e-7b62a311e7eb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140105 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/37fe4f8d-b1a8-4848-805b-f44095a2daeb-tmpfs\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140067 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140158 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/755ab720-3d7a-4d86-8613-147ac07a93dd-config\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140178 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/755ab720-3d7a-4d86-8613-147ac07a93dd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140194 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37fe4f8d-b1a8-4848-805b-f44095a2daeb-webhook-cert\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140219 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/451b0ac4-065b-46e3-813e-7b62a311e7eb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140239 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a004c43d-7acc-4a7e-afc1-947c31df55ad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8tfd\" (UID: \"a004c43d-7acc-4a7e-afc1-947c31df55ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140268 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m24s\" (UniqueName: \"kubernetes.io/projected/451b0ac4-065b-46e3-813e-7b62a311e7eb-kube-api-access-6m24s\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140321 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37fe4f8d-b1a8-4848-805b-f44095a2daeb-apiservice-cert\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140339 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/755ab720-3d7a-4d86-8613-147ac07a93dd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140382 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4xvf\" (UniqueName: \"kubernetes.io/projected/a004c43d-7acc-4a7e-afc1-947c31df55ad-kube-api-access-q4xvf\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8tfd\" (UID: \"a004c43d-7acc-4a7e-afc1-947c31df55ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140432 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/451b0ac4-065b-46e3-813e-7b62a311e7eb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140464 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k4jd\" (UniqueName: \"kubernetes.io/projected/37fe4f8d-b1a8-4848-805b-f44095a2daeb-kube-api-access-4k4jd\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.140770 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/37fe4f8d-b1a8-4848-805b-f44095a2daeb-tmpfs\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.141529 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/755ab720-3d7a-4d86-8613-147ac07a93dd-config\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.142079 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m9vbd"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.142682 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/451b0ac4-065b-46e3-813e-7b62a311e7eb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.143235 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m9vbd"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.144286 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vsntc"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.145193 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37fe4f8d-b1a8-4848-805b-f44095a2daeb-apiservice-cert\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.146020 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37fe4f8d-b1a8-4848-805b-f44095a2daeb-webhook-cert\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.149260 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/451b0ac4-065b-46e3-813e-7b62a311e7eb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.149934 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/755ab720-3d7a-4d86-8613-147ac07a93dd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.150714 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m9vbd"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.151001 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vsntc"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.151394 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vsntc"]
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.160842 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.180417 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.203034 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.221130 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.240934 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.261383 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.280632 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.300368 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.321174 4941 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.340716 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.361722 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.380662 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.400465 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.421893 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.441068 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.460818 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.480869 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.500755 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.520697 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.540735 4941 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.560803 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.580245 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.600464 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.604885 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a004c43d-7acc-4a7e-afc1-947c31df55ad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8tfd\" (UID: \"a004c43d-7acc-4a7e-afc1-947c31df55ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.640762 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.660770 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.680882 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.700574 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.721699 4941 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.740581 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.761587 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.781159 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.801074 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.821968 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.841690 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.860773 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.881111 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.901152 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.920048 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.947249 4941 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.958540 4941 request.go:700] Waited for 1.009807314s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.960306 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 07 06:55:08 crc kubenswrapper[4941]: I0307 06:55:08.981382 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.001605 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.022491 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.041588 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.061908 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.080658 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.100008 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.121603 4941 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.141617 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.171585 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.181220 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.202046 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.221120 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.241372 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.261477 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.280324 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.300865 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.320468 4941 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-infra"/"kube-root-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.378964 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llhvz\" (UniqueName: \"kubernetes.io/projected/e25380ea-ea92-42bb-bd73-4da399dc0cc4-kube-api-access-llhvz\") pod \"openshift-apiserver-operator-796bbdcf4f-lbbxx\" (UID: \"e25380ea-ea92-42bb-bd73-4da399dc0cc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.381223 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.384857 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.401478 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.422004 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.459807 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plk2b\" (UniqueName: \"kubernetes.io/projected/48b6255a-3390-4e84-bed2-6a28fc0c9800-kube-api-access-plk2b\") pod \"openshift-config-operator-7777fb866f-6959d\" (UID: \"48b6255a-3390-4e84-bed2-6a28fc0c9800\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.461083 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.481673 4941 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.500414 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.520985 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.541681 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.545501 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.562306 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.581933 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.601126 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.621828 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.661168 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4nd\" (UniqueName: \"kubernetes.io/projected/79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b-kube-api-access-4x4nd\") pod 
\"console-operator-58897d9998-ljm6r\" (UID: \"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b\") " pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.666541 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx"] Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.674644 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r28vr\" (UniqueName: \"kubernetes.io/projected/612ac789-5007-4e17-a81a-cf753c2acadc-kube-api-access-r28vr\") pod \"oauth-openshift-558db77b4-8n7vr\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:09 crc kubenswrapper[4941]: W0307 06:55:09.679051 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode25380ea_ea92_42bb_bd73_4da399dc0cc4.slice/crio-826421adadd47039e01d07c8c78f7af8d70e5ca20de720ce14abab3aec92db73 WatchSource:0}: Error finding container 826421adadd47039e01d07c8c78f7af8d70e5ca20de720ce14abab3aec92db73: Status 404 returned error can't find the container with id 826421adadd47039e01d07c8c78f7af8d70e5ca20de720ce14abab3aec92db73 Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.695796 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fw49\" (UniqueName: \"kubernetes.io/projected/a5d1706b-179e-4ffd-a2af-e62d05e1e36d-kube-api-access-6fw49\") pod \"apiserver-76f77b778f-j2bnz\" (UID: \"a5d1706b-179e-4ffd-a2af-e62d05e1e36d\") " pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.718078 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4jh\" (UniqueName: \"kubernetes.io/projected/bb8c0212-2a6d-4636-a75b-08a350f5948f-kube-api-access-cm4jh\") pod 
\"controller-manager-879f6c89f-v76pm\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.744195 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6959d"] Mar 07 06:55:09 crc kubenswrapper[4941]: W0307 06:55:09.757279 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b6255a_3390_4e84_bed2_6a28fc0c9800.slice/crio-74f866941f3a094dc79998cb4f4ca64dfdb2ea008468d76d3ecb06524402d027 WatchSource:0}: Error finding container 74f866941f3a094dc79998cb4f4ca64dfdb2ea008468d76d3ecb06524402d027: Status 404 returned error can't find the container with id 74f866941f3a094dc79998cb4f4ca64dfdb2ea008468d76d3ecb06524402d027 Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.773332 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.778178 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzrp4\" (UniqueName: \"kubernetes.io/projected/e51d2d8f-21d6-41a5-a10c-caf3fa669e76-kube-api-access-mzrp4\") pod \"etcd-operator-b45778765-cdrc6\" (UID: \"e51d2d8f-21d6-41a5-a10c-caf3fa669e76\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.784587 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wb5b\" (UniqueName: \"kubernetes.io/projected/44081086-ee5b-4b26-8af9-f35aa03402fc-kube-api-access-4wb5b\") pod \"migrator-59844c95c7-ntlhm\" (UID: \"44081086-ee5b-4b26-8af9-f35aa03402fc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.785494 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vstnp\" (UniqueName: \"kubernetes.io/projected/46da50cb-1038-4289-be6d-e5f3b4c70ab3-kube-api-access-vstnp\") pod \"console-f9d7485db-nwzjs\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.786530 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.799754 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg9j4\" (UniqueName: \"kubernetes.io/projected/a7c60c91-094d-4c52-9dcb-36ad07c829ad-kube-api-access-rg9j4\") pod \"machine-api-operator-5694c8668f-5bq7n\" (UID: \"a7c60c91-094d-4c52-9dcb-36ad07c829ad\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.821315 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shw46\" (UniqueName: \"kubernetes.io/projected/b14f5741-27aa-4ff2-a2dc-69c385a07e16-kube-api-access-shw46\") pod \"machine-approver-56656f9798-hr792\" (UID: \"b14f5741-27aa-4ff2-a2dc-69c385a07e16\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.832649 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.837357 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rs9x\" (UniqueName: \"kubernetes.io/projected/f2470875-024a-4ef0-9e01-20bbbfff60bc-kube-api-access-8rs9x\") pod \"dns-operator-744455d44c-x2rhs\" (UID: \"f2470875-024a-4ef0-9e01-20bbbfff60bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.914990 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.920361 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.920545 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.920764 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.921495 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.923143 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x54rk\" (UniqueName: \"kubernetes.io/projected/3f291eed-0e60-43a7-a34a-2f7ad9788126-kube-api-access-x54rk\") pod \"authentication-operator-69f744f599-hvmvs\" (UID: \"3f291eed-0e60-43a7-a34a-2f7ad9788126\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.936656 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.938957 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5gwt\" (UniqueName: \"kubernetes.io/projected/9beea563-7739-4b17-b360-bb769400bdff-kube-api-access-r5gwt\") pod \"cluster-samples-operator-665b6dd947-ccz2v\" (UID: \"9beea563-7739-4b17-b360-bb769400bdff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.941364 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.944699 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.959691 4941 request.go:700] Waited for 1.835048578s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.962120 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.962592 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" Mar 07 06:55:09 crc kubenswrapper[4941]: I0307 06:55:09.983005 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.003321 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.010766 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.036115 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.040004 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" event={"ID":"e25380ea-ea92-42bb-bd73-4da399dc0cc4","Type":"ContainerStarted","Data":"826421adadd47039e01d07c8c78f7af8d70e5ca20de720ce14abab3aec92db73"} Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.041508 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k4jd\" (UniqueName: \"kubernetes.io/projected/37fe4f8d-b1a8-4848-805b-f44095a2daeb-kube-api-access-4k4jd\") pod \"packageserver-d55dfcdfc-cp52v\" (UID: \"37fe4f8d-b1a8-4848-805b-f44095a2daeb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.044376 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" 
event={"ID":"48b6255a-3390-4e84-bed2-6a28fc0c9800","Type":"ContainerStarted","Data":"74f866941f3a094dc79998cb4f4ca64dfdb2ea008468d76d3ecb06524402d027"} Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.048016 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.058549 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ljm6r"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.060336 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/451b0ac4-065b-46e3-813e-7b62a311e7eb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.078340 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m24s\" (UniqueName: \"kubernetes.io/projected/451b0ac4-065b-46e3-813e-7b62a311e7eb-kube-api-access-6m24s\") pod \"cluster-image-registry-operator-dc59b4c8b-lhtqh\" (UID: \"451b0ac4-065b-46e3-813e-7b62a311e7eb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.091516 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.098997 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4xvf\" (UniqueName: \"kubernetes.io/projected/a004c43d-7acc-4a7e-afc1-947c31df55ad-kube-api-access-q4xvf\") pod \"control-plane-machine-set-operator-78cbb6b69f-c8tfd\" (UID: \"a004c43d-7acc-4a7e-afc1-947c31df55ad\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.119898 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/755ab720-3d7a-4d86-8613-147ac07a93dd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qhnm7\" (UID: \"755ab720-3d7a-4d86-8613-147ac07a93dd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.127641 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.132067 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrc6"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.148012 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.161633 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.181216 4941 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 07 06:55:10 crc kubenswrapper[4941]: W0307 06:55:10.184301 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51d2d8f_21d6_41a5_a10c_caf3fa669e76.slice/crio-e3669444e38efd11b4d7e0c446a5da7e8a9ed20d80ea29bed118fa8ade16ea68 WatchSource:0}: Error finding container e3669444e38efd11b4d7e0c446a5da7e8a9ed20d80ea29bed118fa8ade16ea68: Status 404 returned error can't find the container with id e3669444e38efd11b4d7e0c446a5da7e8a9ed20d80ea29bed118fa8ade16ea68 Mar 07 06:55:10 crc kubenswrapper[4941]: 
I0307 06:55:10.211229 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.211886 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.217171 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.222370 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.274523 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.282026 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.314471 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.314557 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324131 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0cf41cff-9af9-423f-8e57-117983f90b7b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324201 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-trusted-ca\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324257 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87cwj\" (UniqueName: \"kubernetes.io/projected/45977497-a17d-40f6-bf09-22aff4a45738-kube-api-access-87cwj\") 
pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324356 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnml\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-kube-api-access-rvnml\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324383 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw4wf\" (UniqueName: \"kubernetes.io/projected/b0b8ae0c-7ebc-4db9-a622-ab41e63e0690-kube-api-access-qw4wf\") pod \"multus-admission-controller-857f4d67dd-cb2gw\" (UID: \"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324459 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/915093d7-a9dd-4f42-8b79-c887eaf983fe-srv-cert\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324482 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f01d22-70a1-4a37-b942-722fd80ed583-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 
07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324544 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324606 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f01d22-70a1-4a37-b942-722fd80ed583-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324707 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324757 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-encryption-config\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324779 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-serving-cert\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.324933 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45977497-a17d-40f6-bf09-22aff4a45738-proxy-tls\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.325530 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.326011 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrkj\" (UniqueName: \"kubernetes.io/projected/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-kube-api-access-pmrkj\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.326051 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-config-volume\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: 
I0307 06:55:10.326071 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/45977497-a17d-40f6-bf09-22aff4a45738-images\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.326151 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2p2v\" (UniqueName: \"kubernetes.io/projected/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-kube-api-access-z2p2v\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.326185 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.326256 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-audit-dir\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.326739 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s7jf\" (UniqueName: 
\"kubernetes.io/projected/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-kube-api-access-9s7jf\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.326778 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9zws\" (UniqueName: \"kubernetes.io/projected/a77c6084-94de-4ebc-9a75-a83efa28b094-kube-api-access-r9zws\") pod \"downloads-7954f5f757-5tr44\" (UID: \"a77c6084-94de-4ebc-9a75-a83efa28b094\") " pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.327880 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0cf41cff-9af9-423f-8e57-117983f90b7b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.328159 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.328606 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b0b8ae0c-7ebc-4db9-a622-ab41e63e0690-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cb2gw\" (UID: 
\"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" Mar 07 06:55:10 crc kubenswrapper[4941]: E0307 06:55:10.328812 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:10.828763924 +0000 UTC m=+207.781129589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329089 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-etcd-client\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329666 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-tls\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329704 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmqtf\" (UniqueName: 
\"kubernetes.io/projected/915093d7-a9dd-4f42-8b79-c887eaf983fe-kube-api-access-jmqtf\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329729 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/45977497-a17d-40f6-bf09-22aff4a45738-auth-proxy-config\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329754 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-secret-volume\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329801 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/915093d7-a9dd-4f42-8b79-c887eaf983fe-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329878 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-certificates\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329905 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvcxz\" (UniqueName: \"kubernetes.io/projected/90f01d22-70a1-4a37-b942-722fd80ed583-kube-api-access-cvcxz\") pod \"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329931 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-bound-sa-token\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.329956 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-audit-policies\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.330424 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.353785 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431031 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431282 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b0b8ae0c-7ebc-4db9-a622-ab41e63e0690-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cb2gw\" (UID: \"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431322 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-etcd-client\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431353 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-serving-cert\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 
06:55:10.431379 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-metrics-tls\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431452 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/743d3e91-514c-4323-9fce-9d29d4d6e816-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431487 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6cedfec9-da7b-4080-894f-dbc3b3dbad46-signing-cabundle\") pod \"service-ca-9c57cc56f-8tds9\" (UID: \"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431515 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-tls\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431543 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vcj\" (UniqueName: \"kubernetes.io/projected/6cedfec9-da7b-4080-894f-dbc3b3dbad46-kube-api-access-g6vcj\") pod \"service-ca-9c57cc56f-8tds9\" (UID: 
\"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431590 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmqtf\" (UniqueName: \"kubernetes.io/projected/915093d7-a9dd-4f42-8b79-c887eaf983fe-kube-api-access-jmqtf\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431618 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/45977497-a17d-40f6-bf09-22aff4a45738-auth-proxy-config\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431644 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-secret-volume\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.431670 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-proxy-tls\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: E0307 06:55:10.431743 4941 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:10.931710831 +0000 UTC m=+207.884076296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.432870 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/45977497-a17d-40f6-bf09-22aff4a45738-auth-proxy-config\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434220 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/915093d7-a9dd-4f42-8b79-c887eaf983fe-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434326 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-mountpoint-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " 
pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434348 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb104051-0f66-4faf-95f5-a7a28d41cdc2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434459 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-default-certificate\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434563 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvcxz\" (UniqueName: \"kubernetes.io/projected/90f01d22-70a1-4a37-b942-722fd80ed583-kube-api-access-cvcxz\") pod \"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434606 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdhnl\" (UniqueName: \"kubernetes.io/projected/007a3578-3ca3-4979-aa06-b2ffe3c7718e-kube-api-access-gdhnl\") pod \"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434685 4941 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-certificates\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434748 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l9gg\" (UniqueName: \"kubernetes.io/projected/baf7bbe6-5859-4df3-9164-a62bb2333078-kube-api-access-9l9gg\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434782 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-bound-sa-token\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434800 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-audit-policies\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434819 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/007a3578-3ca3-4979-aa06-b2ffe3c7718e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434876 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0cf41cff-9af9-423f-8e57-117983f90b7b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434898 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-trusted-ca\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434917 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87cwj\" (UniqueName: \"kubernetes.io/projected/45977497-a17d-40f6-bf09-22aff4a45738-kube-api-access-87cwj\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.434970 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc1105d7-08aa-4d81-9bb1-e162f649529d-cert\") pod \"ingress-canary-8lsvm\" (UID: \"dc1105d7-08aa-4d81-9bb1-e162f649529d\") " pod="openshift-ingress-canary/ingress-canary-8lsvm" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.435207 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnml\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-kube-api-access-rvnml\") 
pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.435247 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dszkg\" (UniqueName: \"kubernetes.io/projected/dc1105d7-08aa-4d81-9bb1-e162f649529d-kube-api-access-dszkg\") pod \"ingress-canary-8lsvm\" (UID: \"dc1105d7-08aa-4d81-9bb1-e162f649529d\") " pod="openshift-ingress-canary/ingress-canary-8lsvm" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.435266 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmqb5\" (UniqueName: \"kubernetes.io/projected/fdca2db2-c710-4685-9033-fbbe40f73076-kube-api-access-pmqb5\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.435288 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-csi-data-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.435312 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw4wf\" (UniqueName: \"kubernetes.io/projected/b0b8ae0c-7ebc-4db9-a622-ab41e63e0690-kube-api-access-qw4wf\") pod \"multus-admission-controller-857f4d67dd-cb2gw\" (UID: \"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.435362 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/915093d7-a9dd-4f42-8b79-c887eaf983fe-srv-cert\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.435681 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0cf41cff-9af9-423f-8e57-117983f90b7b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436251 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-config\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436423 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f01d22-70a1-4a37-b942-722fd80ed583-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436546 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb104051-0f66-4faf-95f5-a7a28d41cdc2-config\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436658 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436696 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/743d3e91-514c-4323-9fce-9d29d4d6e816-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436734 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cedfec9-da7b-4080-894f-dbc3b3dbad46-signing-key\") pod \"service-ca-9c57cc56f-8tds9\" (UID: \"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436764 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxhjd\" (UniqueName: \"kubernetes.io/projected/7566a1ff-1f6b-4e99-903b-ff036f98c411-kube-api-access-sxhjd\") pod \"auto-csr-approver-29547774-2b2fh\" (UID: \"7566a1ff-1f6b-4e99-903b-ff036f98c411\") " pod="openshift-infra/auto-csr-approver-29547774-2b2fh" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436778 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-audit-policies\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.436794 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjbml\" (UniqueName: \"kubernetes.io/projected/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-kube-api-access-gjbml\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.437289 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flt9l\" (UniqueName: \"kubernetes.io/projected/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-kube-api-access-flt9l\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.443310 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-certificates\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.453289 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-trusted-ca\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.456819 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b0b8ae0c-7ebc-4db9-a622-ab41e63e0690-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cb2gw\" (UID: \"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.457308 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-etcd-client\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.457707 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f01d22-70a1-4a37-b942-722fd80ed583-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.457875 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-secret-volume\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.457973 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f01d22-70a1-4a37-b942-722fd80ed583-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458020 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4s28\" (UniqueName: \"kubernetes.io/projected/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-kube-api-access-j4s28\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458055 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdca2db2-c710-4685-9033-fbbe40f73076-serving-cert\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458122 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458155 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-encryption-config\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458184 4941 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-serving-cert\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458211 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb104051-0f66-4faf-95f5-a7a28d41cdc2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458236 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc64ec5b-3c7b-47de-8a6f-6c2555ba6465-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p666q\" (UID: \"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458274 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-metrics-certs\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458323 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45977497-a17d-40f6-bf09-22aff4a45738-proxy-tls\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458373 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-client-ca\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458442 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-node-bootstrap-token\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458476 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458538 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhsnj\" (UniqueName: \"kubernetes.io/projected/bffb6535-e060-4328-8e0d-0b8bd64c656b-kube-api-access-rhsnj\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458568 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-certs\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458578 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458691 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-config-volume\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458726 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrkj\" (UniqueName: \"kubernetes.io/projected/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-kube-api-access-pmrkj\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458762 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/45977497-a17d-40f6-bf09-22aff4a45738-images\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458797 
4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2p2v\" (UniqueName: \"kubernetes.io/projected/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-kube-api-access-z2p2v\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458834 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-stats-auth\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458909 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.458965 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459004 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459027 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffb6535-e060-4328-8e0d-0b8bd64c656b-service-ca-bundle\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459063 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-audit-dir\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459081 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/007a3578-3ca3-4979-aa06-b2ffe3c7718e-srv-cert\") pod \"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459108 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s7jf\" (UniqueName: \"kubernetes.io/projected/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-kube-api-access-9s7jf\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459138 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459162 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-config\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459182 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/743d3e91-514c-4323-9fce-9d29d4d6e816-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.459208 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9zws\" (UniqueName: \"kubernetes.io/projected/a77c6084-94de-4ebc-9a75-a83efa28b094-kube-api-access-r9zws\") pod \"downloads-7954f5f757-5tr44\" (UID: \"a77c6084-94de-4ebc-9a75-a83efa28b094\") " pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.460016 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f01d22-70a1-4a37-b942-722fd80ed583-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.460094 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-config-volume\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.460532 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/915093d7-a9dd-4f42-8b79-c887eaf983fe-srv-cert\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.460744 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/45977497-a17d-40f6-bf09-22aff4a45738-images\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.460936 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-audit-dir\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.461062 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8mfp9\" 
(UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.461111 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-metrics-tls\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.461137 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jtx\" (UniqueName: \"kubernetes.io/projected/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-kube-api-access-j2jtx\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.461170 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-socket-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.461191 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-plugins-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.461221 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2gdh\" (UniqueName: 
\"kubernetes.io/projected/dc64ec5b-3c7b-47de-8a6f-6c2555ba6465-kube-api-access-f2gdh\") pod \"package-server-manager-789f6589d5-p666q\" (UID: \"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.462037 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/915093d7-a9dd-4f42-8b79-c887eaf983fe-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.462161 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.462668 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.463199 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-tls\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.463319 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvn5m\" (UniqueName: \"kubernetes.io/projected/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-kube-api-access-cvn5m\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.463478 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0cf41cff-9af9-423f-8e57-117983f90b7b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.464047 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-registration-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.464191 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5p4h\" (UniqueName: \"kubernetes.io/projected/b6ad282e-3374-4a34-8956-252c6196274d-kube-api-access-x5p4h\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.464570 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-config-volume\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.466739 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.468301 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-encryption-config\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.468743 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-serving-cert\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.469363 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.469659 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0cf41cff-9af9-423f-8e57-117983f90b7b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.482979 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/45977497-a17d-40f6-bf09-22aff4a45738-proxy-tls\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.493079 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmqtf\" (UniqueName: \"kubernetes.io/projected/915093d7-a9dd-4f42-8b79-c887eaf983fe-kube-api-access-jmqtf\") pod \"catalog-operator-68c6474976-7vzg6\" (UID: \"915093d7-a9dd-4f42-8b79-c887eaf983fe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.508036 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvcxz\" (UniqueName: \"kubernetes.io/projected/90f01d22-70a1-4a37-b942-722fd80ed583-kube-api-access-cvcxz\") pod \"openshift-controller-manager-operator-756b6f6bc6-8qnmz\" (UID: \"90f01d22-70a1-4a37-b942-722fd80ed583\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.517925 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw4wf\" (UniqueName: \"kubernetes.io/projected/b0b8ae0c-7ebc-4db9-a622-ab41e63e0690-kube-api-access-qw4wf\") pod \"multus-admission-controller-857f4d67dd-cb2gw\" (UID: \"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.536354 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.545253 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.551540 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87cwj\" (UniqueName: \"kubernetes.io/projected/45977497-a17d-40f6-bf09-22aff4a45738-kube-api-access-87cwj\") pod \"machine-config-operator-74547568cd-csdsw\" (UID: \"45977497-a17d-40f6-bf09-22aff4a45738\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.559429 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-bound-sa-token\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.574851 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-mountpoint-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.574923 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fb104051-0f66-4faf-95f5-a7a28d41cdc2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.574946 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-default-certificate\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.574982 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdhnl\" (UniqueName: \"kubernetes.io/projected/007a3578-3ca3-4979-aa06-b2ffe3c7718e-kube-api-access-gdhnl\") pod \"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.575006 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l9gg\" (UniqueName: \"kubernetes.io/projected/baf7bbe6-5859-4df3-9164-a62bb2333078-kube-api-access-9l9gg\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.575053 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/007a3578-3ca3-4979-aa06-b2ffe3c7718e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc 
kubenswrapper[4941]: I0307 06:55:10.575084 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc1105d7-08aa-4d81-9bb1-e162f649529d-cert\") pod \"ingress-canary-8lsvm\" (UID: \"dc1105d7-08aa-4d81-9bb1-e162f649529d\") " pod="openshift-ingress-canary/ingress-canary-8lsvm" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.575123 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-mountpoint-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.575842 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dszkg\" (UniqueName: \"kubernetes.io/projected/dc1105d7-08aa-4d81-9bb1-e162f649529d-kube-api-access-dszkg\") pod \"ingress-canary-8lsvm\" (UID: \"dc1105d7-08aa-4d81-9bb1-e162f649529d\") " pod="openshift-ingress-canary/ingress-canary-8lsvm" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.575874 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmqb5\" (UniqueName: \"kubernetes.io/projected/fdca2db2-c710-4685-9033-fbbe40f73076-kube-api-access-pmqb5\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.575919 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-csi-data-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc 
kubenswrapper[4941]: I0307 06:55:10.576428 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-csi-data-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576615 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-config\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576640 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb104051-0f66-4faf-95f5-a7a28d41cdc2-config\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576682 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/743d3e91-514c-4323-9fce-9d29d4d6e816-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576705 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cedfec9-da7b-4080-894f-dbc3b3dbad46-signing-key\") pod \"service-ca-9c57cc56f-8tds9\" (UID: \"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576723 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxhjd\" (UniqueName: \"kubernetes.io/projected/7566a1ff-1f6b-4e99-903b-ff036f98c411-kube-api-access-sxhjd\") pod \"auto-csr-approver-29547774-2b2fh\" (UID: \"7566a1ff-1f6b-4e99-903b-ff036f98c411\") " pod="openshift-infra/auto-csr-approver-29547774-2b2fh" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576759 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjbml\" (UniqueName: \"kubernetes.io/projected/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-kube-api-access-gjbml\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576778 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flt9l\" (UniqueName: \"kubernetes.io/projected/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-kube-api-access-flt9l\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576798 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4s28\" (UniqueName: \"kubernetes.io/projected/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-kube-api-access-j4s28\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576845 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdca2db2-c710-4685-9033-fbbe40f73076-serving-cert\") 
pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576866 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb104051-0f66-4faf-95f5-a7a28d41cdc2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576883 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc64ec5b-3c7b-47de-8a6f-6c2555ba6465-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p666q\" (UID: \"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576923 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-metrics-certs\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576945 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-client-ca\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.576963 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-node-bootstrap-token\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577001 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhsnj\" (UniqueName: \"kubernetes.io/projected/bffb6535-e060-4328-8e0d-0b8bd64c656b-kube-api-access-rhsnj\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577016 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-certs\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577170 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-stats-auth\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577196 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 
crc kubenswrapper[4941]: I0307 06:55:10.577223 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-trusted-ca\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577263 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffb6535-e060-4328-8e0d-0b8bd64c656b-service-ca-bundle\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577284 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/007a3578-3ca3-4979-aa06-b2ffe3c7718e-srv-cert\") pod \"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577329 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577351 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-config\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577369 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/743d3e91-514c-4323-9fce-9d29d4d6e816-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577472 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-metrics-tls\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577493 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jtx\" (UniqueName: \"kubernetes.io/projected/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-kube-api-access-j2jtx\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577511 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-socket-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577552 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-plugins-dir\") pod 
\"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577571 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2gdh\" (UniqueName: \"kubernetes.io/projected/dc64ec5b-3c7b-47de-8a6f-6c2555ba6465-kube-api-access-f2gdh\") pod \"package-server-manager-789f6589d5-p666q\" (UID: \"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577622 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577643 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577663 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvn5m\" (UniqueName: \"kubernetes.io/projected/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-kube-api-access-cvn5m\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577711 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-registration-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577729 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5p4h\" (UniqueName: \"kubernetes.io/projected/b6ad282e-3374-4a34-8956-252c6196274d-kube-api-access-x5p4h\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577749 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-config-volume\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577800 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577822 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-serving-cert\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc 
kubenswrapper[4941]: I0307 06:55:10.577864 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-metrics-tls\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577885 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/743d3e91-514c-4323-9fce-9d29d4d6e816-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.577903 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6cedfec9-da7b-4080-894f-dbc3b3dbad46-signing-cabundle\") pod \"service-ca-9c57cc56f-8tds9\" (UID: \"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.578342 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vcj\" (UniqueName: \"kubernetes.io/projected/6cedfec9-da7b-4080-894f-dbc3b3dbad46-kube-api-access-g6vcj\") pod \"service-ca-9c57cc56f-8tds9\" (UID: \"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.578388 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-proxy-tls\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.578550 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-config\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.578882 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-trusted-ca\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.581150 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-client-ca\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: E0307 06:55:10.581646 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.081626817 +0000 UTC m=+208.033992282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.583050 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.584858 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb104051-0f66-4faf-95f5-a7a28d41cdc2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.604865 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-serving-cert\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.585393 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-registration-dir\") pod \"csi-hostpathplugin-vsntc\" 
(UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.586223 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-config-volume\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.588245 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-default-certificate\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.589456 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc1105d7-08aa-4d81-9bb1-e162f649529d-cert\") pod \"ingress-canary-8lsvm\" (UID: \"dc1105d7-08aa-4d81-9bb1-e162f649529d\") " pod="openshift-ingress-canary/ingress-canary-8lsvm" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.589608 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-socket-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.591064 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/743d3e91-514c-4323-9fce-9d29d4d6e816-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.592236 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/007a3578-3ca3-4979-aa06-b2ffe3c7718e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.592317 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bffb6535-e060-4328-8e0d-0b8bd64c656b-service-ca-bundle\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.593083 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b6ad282e-3374-4a34-8956-252c6196274d-plugins-dir\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.595537 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/743d3e91-514c-4323-9fce-9d29d4d6e816-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.595819 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/007a3578-3ca3-4979-aa06-b2ffe3c7718e-srv-cert\") pod 
\"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.595993 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-config\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.596645 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-stats-auth\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.597194 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-certs\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.597198 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc64ec5b-3c7b-47de-8a6f-6c2555ba6465-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p666q\" (UID: \"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.597273 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fdca2db2-c710-4685-9033-fbbe40f73076-serving-cert\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.597287 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.597730 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-metrics-tls\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.600990 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb104051-0f66-4faf-95f5-a7a28d41cdc2-config\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.601292 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-proxy-tls\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.604347 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.584946 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.605239 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-metrics-tls\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.606610 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cedfec9-da7b-4080-894f-dbc3b3dbad46-signing-key\") pod \"service-ca-9c57cc56f-8tds9\" (UID: \"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.607396 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-node-bootstrap-token\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.609877 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bffb6535-e060-4328-8e0d-0b8bd64c656b-metrics-certs\") pod 
\"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.612192 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6cedfec9-da7b-4080-894f-dbc3b3dbad46-signing-cabundle\") pod \"service-ca-9c57cc56f-8tds9\" (UID: \"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.615496 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9zws\" (UniqueName: \"kubernetes.io/projected/a77c6084-94de-4ebc-9a75-a83efa28b094-kube-api-access-r9zws\") pod \"downloads-7954f5f757-5tr44\" (UID: \"a77c6084-94de-4ebc-9a75-a83efa28b094\") " pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.615637 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnml\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-kube-api-access-rvnml\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.629523 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s7jf\" (UniqueName: \"kubernetes.io/projected/0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8-kube-api-access-9s7jf\") pod \"kube-storage-version-migrator-operator-b67b599dd-w9tck\" (UID: \"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.642486 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2p2v\" 
(UniqueName: \"kubernetes.io/projected/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-kube-api-access-z2p2v\") pod \"collect-profiles-29547765-2wh77\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.670365 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v76pm"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.672559 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x2rhs"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.674445 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j2bnz"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.679033 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:10 crc kubenswrapper[4941]: E0307 06:55:10.679654 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.179631696 +0000 UTC m=+208.131997161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.684957 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrkj\" (UniqueName: \"kubernetes.io/projected/1f1180d1-6136-4fa5-9c10-7c4a25d4dff8-kube-api-access-pmrkj\") pod \"apiserver-7bbb656c7d-8mfp9\" (UID: \"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.709877 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdhnl\" (UniqueName: \"kubernetes.io/projected/007a3578-3ca3-4979-aa06-b2ffe3c7718e-kube-api-access-gdhnl\") pod \"olm-operator-6b444d44fb-75pn8\" (UID: \"007a3578-3ca3-4979-aa06-b2ffe3c7718e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.734015 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dszkg\" (UniqueName: \"kubernetes.io/projected/dc1105d7-08aa-4d81-9bb1-e162f649529d-kube-api-access-dszkg\") pod \"ingress-canary-8lsvm\" (UID: \"dc1105d7-08aa-4d81-9bb1-e162f649529d\") " pod="openshift-ingress-canary/ingress-canary-8lsvm" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.763137 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l9gg\" (UniqueName: \"kubernetes.io/projected/baf7bbe6-5859-4df3-9164-a62bb2333078-kube-api-access-9l9gg\") pod \"marketplace-operator-79b997595-7x6zc\" (UID: 
\"baf7bbe6-5859-4df3-9164-a62bb2333078\") " pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.763483 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8lsvm" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.766554 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmqb5\" (UniqueName: \"kubernetes.io/projected/fdca2db2-c710-4685-9033-fbbe40f73076-kube-api-access-pmqb5\") pod \"route-controller-manager-6576b87f9c-zm7wn\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.780416 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: E0307 06:55:10.780809 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.280794724 +0000 UTC m=+208.233160189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.783132 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhsnj\" (UniqueName: \"kubernetes.io/projected/bffb6535-e060-4328-8e0d-0b8bd64c656b-kube-api-access-rhsnj\") pod \"router-default-5444994796-w745p\" (UID: \"bffb6535-e060-4328-8e0d-0b8bd64c656b\") " pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.797557 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5bq7n"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.798711 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8n7vr"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.800087 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nwzjs"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.800802 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.808386 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb104051-0f66-4faf-95f5-a7a28d41cdc2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s8sj2\" (UID: \"fb104051-0f66-4faf-95f5-a7a28d41cdc2\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.814604 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxhjd\" (UniqueName: \"kubernetes.io/projected/7566a1ff-1f6b-4e99-903b-ff036f98c411-kube-api-access-sxhjd\") pod \"auto-csr-approver-29547774-2b2fh\" (UID: \"7566a1ff-1f6b-4e99-903b-ff036f98c411\") " pod="openshift-infra/auto-csr-approver-29547774-2b2fh" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.823883 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.837013 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/743d3e91-514c-4323-9fce-9d29d4d6e816-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2kb4\" (UID: \"743d3e91-514c-4323-9fce-9d29d4d6e816\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.868374 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.869543 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjbml\" (UniqueName: \"kubernetes.io/projected/07b5dbbc-b835-437d-bc58-1f4dc88e48bc-kube-api-access-gjbml\") pod \"machine-config-controller-84d6567774-rxqwq\" (UID: \"07b5dbbc-b835-437d-bc58-1f4dc88e48bc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.870236 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.881016 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:10 crc kubenswrapper[4941]: E0307 06:55:10.881713 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.381695134 +0000 UTC m=+208.334060599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.888071 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flt9l\" (UniqueName: \"kubernetes.io/projected/5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0-kube-api-access-flt9l\") pod \"service-ca-operator-777779d784-29stc\" (UID: \"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.895272 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.908853 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.911847 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4s28\" (UniqueName: \"kubernetes.io/projected/25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17-kube-api-access-j4s28\") pod \"dns-default-m9vbd\" (UID: \"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17\") " pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.914303 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.921127 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvn5m\" (UniqueName: \"kubernetes.io/projected/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-kube-api-access-cvn5m\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:10 crc kubenswrapper[4941]: W0307 06:55:10.921163 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod451b0ac4_065b_46e3_813e_7b62a311e7eb.slice/crio-2495a6ea92b4079e7177dc04bb24c99b74599e82cb53921969909706c70e068a WatchSource:0}: Error finding container 2495a6ea92b4079e7177dc04bb24c99b74599e82cb53921969909706c70e068a: Status 404 returned error can't find the container with id 2495a6ea92b4079e7177dc04bb24c99b74599e82cb53921969909706c70e068a Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.924217 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.933097 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.935517 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5p4h\" (UniqueName: \"kubernetes.io/projected/b6ad282e-3374-4a34-8956-252c6196274d-kube-api-access-x5p4h\") pod \"csi-hostpathplugin-vsntc\" (UID: \"b6ad282e-3374-4a34-8956-252c6196274d\") " pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.939041 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.954309 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.969998 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.976508 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vcj\" (UniqueName: \"kubernetes.io/projected/6cedfec9-da7b-4080-894f-dbc3b3dbad46-kube-api-access-g6vcj\") pod \"service-ca-9c57cc56f-8tds9\" (UID: \"6cedfec9-da7b-4080-894f-dbc3b3dbad46\") " pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.980822 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hvmvs"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.982239 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:10 crc kubenswrapper[4941]: E0307 06:55:10.982562 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.482549633 +0000 UTC m=+208.434915098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.982239 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.985681 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jtx\" (UniqueName: \"kubernetes.io/projected/fe2bbec6-3204-4ad3-bbcc-bdc27060c44b-kube-api-access-j2jtx\") pod \"machine-config-server-skhtt\" (UID: \"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b\") " pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.999206 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v"] Mar 07 06:55:10 crc kubenswrapper[4941]: I0307 06:55:10.999881 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.001081 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.004493 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.004525 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mg5ld\" (UID: \"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.019212 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2gdh\" (UniqueName: \"kubernetes.io/projected/dc64ec5b-3c7b-47de-8a6f-6c2555ba6465-kube-api-access-f2gdh\") pod \"package-server-manager-789f6589d5-p666q\" (UID: \"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.025113 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.032621 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:11 crc kubenswrapper[4941]: W0307 06:55:11.036682 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f291eed_0e60_43a7_a34a_2f7ad9788126.slice/crio-d3a95ff86a4b1e53a1786f457c1f95dc47516315abf7e8e981d9b94c09cadcf8 WatchSource:0}: Error finding container d3a95ff86a4b1e53a1786f457c1f95dc47516315abf7e8e981d9b94c09cadcf8: Status 404 returned error can't find the container with id d3a95ff86a4b1e53a1786f457c1f95dc47516315abf7e8e981d9b94c09cadcf8 Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.038199 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" Mar 07 06:55:11 crc kubenswrapper[4941]: W0307 06:55:11.042528 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37fe4f8d_b1a8_4848_805b_f44095a2daeb.slice/crio-7fd1a03c244767ab3d014d553f7990cb33d16d9db703d7212d75890125f185f7 WatchSource:0}: Error finding container 7fd1a03c244767ab3d014d553f7990cb33d16d9db703d7212d75890125f185f7: Status 404 returned error can't find the container with id 7fd1a03c244767ab3d014d553f7990cb33d16d9db703d7212d75890125f185f7 Mar 07 06:55:11 crc kubenswrapper[4941]: W0307 06:55:11.047238 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda004c43d_7acc_4a7e_afc1_947c31df55ad.slice/crio-a709fd222f1027cd6eefa86bb89be1b210a5839bc1b86a870b509c5902f327ed WatchSource:0}: Error finding container a709fd222f1027cd6eefa86bb89be1b210a5839bc1b86a870b509c5902f327ed: Status 404 returned error can't find the container with id a709fd222f1027cd6eefa86bb89be1b210a5839bc1b86a870b509c5902f327ed Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 
06:55:11.053899 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.070968 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" event={"ID":"90f01d22-70a1-4a37-b942-722fd80ed583","Type":"ContainerStarted","Data":"3ea21122328eef95096d504f8a82816e7c3ed1f5c7ff2471df897960eaebd171"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.070990 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-skhtt" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.072942 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" event={"ID":"37fe4f8d-b1a8-4848-805b-f44095a2daeb","Type":"ContainerStarted","Data":"7fd1a03c244767ab3d014d553f7990cb33d16d9db703d7212d75890125f185f7"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.075969 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" event={"ID":"755ab720-3d7a-4d86-8613-147ac07a93dd","Type":"ContainerStarted","Data":"5f4b5ebf79abadff63f829b277d2bddc3006c991a16ef47842624e3512111f09"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.078521 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.083296 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.083384 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" event={"ID":"e25380ea-ea92-42bb-bd73-4da399dc0cc4","Type":"ContainerStarted","Data":"149cc858de59ca5686ed9fa67428393ed6a566b71815fc6262defea25deafcbe"} Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.083573 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.583553626 +0000 UTC m=+208.535919091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.084983 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vsntc" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.107533 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" event={"ID":"b14f5741-27aa-4ff2-a2dc-69c385a07e16","Type":"ContainerStarted","Data":"3b1d00613be0897141cec0486eed86e6de9a10614a3184b68a1a1c14176040dd"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.107580 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" event={"ID":"b14f5741-27aa-4ff2-a2dc-69c385a07e16","Type":"ContainerStarted","Data":"9980461d589e6184b2d9a377383ff3e8a782981930ec5b570f74afb9291d37c2"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.112153 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" event={"ID":"915093d7-a9dd-4f42-8b79-c887eaf983fe","Type":"ContainerStarted","Data":"bc7b9c5d3dd938e9362282dcb498e8ef017d016a053e38a37b19a94a8cb46884"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.114536 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwzjs" event={"ID":"46da50cb-1038-4289-be6d-e5f3b4c70ab3","Type":"ContainerStarted","Data":"5dab62067ec5636ab980e7b59d12f6cb4858ac3410a31593e4ac1320d927873d"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.121019 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" event={"ID":"a7c60c91-094d-4c52-9dcb-36ad07c829ad","Type":"ContainerStarted","Data":"9e8591d2b145a2f6fe3fb26186310079b1f27e5d9e3549f06d1e8f5fb20bd285"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.136648 4941 generic.go:334] "Generic (PLEG): container finished" podID="48b6255a-3390-4e84-bed2-6a28fc0c9800" 
containerID="80562d33327120521ad01980c702da6d38447cf02ff158fcb70ecceaa93dbbc2" exitCode=0 Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.137145 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" event={"ID":"48b6255a-3390-4e84-bed2-6a28fc0c9800","Type":"ContainerDied","Data":"80562d33327120521ad01980c702da6d38447cf02ff158fcb70ecceaa93dbbc2"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.141181 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" event={"ID":"44081086-ee5b-4b26-8af9-f35aa03402fc","Type":"ContainerStarted","Data":"07aa5be748ea003575cfb44ed514bb3a7029b4f458eb8d3e179e34b1373c582e"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.141235 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" event={"ID":"44081086-ee5b-4b26-8af9-f35aa03402fc","Type":"ContainerStarted","Data":"08e834843d0702f0bcae3f21d3a60c679c7d6adb8d3b6e96787f8cc44f581a85"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.141250 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" event={"ID":"44081086-ee5b-4b26-8af9-f35aa03402fc","Type":"ContainerStarted","Data":"928f2c51734b774ffb75a9d7df22cad4fd0c03a1d28407bfa5197ffae3d92a41"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.145758 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ljm6r" event={"ID":"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b","Type":"ContainerStarted","Data":"b9a35319bb82573e8e7d8bc0fc5a56a68790cd68f11650450ba1312b1c7b6a5d"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.145806 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ljm6r" 
event={"ID":"79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b","Type":"ContainerStarted","Data":"9dd714c79efe1c442ac3122f44fcccfde2f293a2c17eeaefad97af796d4be80b"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.146362 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.150800 4941 patch_prober.go:28] interesting pod/console-operator-58897d9998-ljm6r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.150859 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ljm6r" podUID="79479b8e-7b8f-4fb2-8b73-5f75d6ad0f3b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.153391 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" event={"ID":"e51d2d8f-21d6-41a5-a10c-caf3fa669e76","Type":"ContainerStarted","Data":"4caa65d8a40ef39cce7c4eefc0e16e4222715ed94558eff1bbb89331eae81781"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.153483 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" event={"ID":"e51d2d8f-21d6-41a5-a10c-caf3fa669e76","Type":"ContainerStarted","Data":"e3669444e38efd11b4d7e0c446a5da7e8a9ed20d80ea29bed118fa8ade16ea68"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.156434 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" 
event={"ID":"bb8c0212-2a6d-4636-a75b-08a350f5948f","Type":"ContainerStarted","Data":"0fa92f36965ccbe86d8063d51146e42efa08635dd7940af9b0e68971e9995679"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.157211 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" event={"ID":"f2470875-024a-4ef0-9e01-20bbbfff60bc","Type":"ContainerStarted","Data":"b9a5eca8a55c69f5c65201b8e8090757d3450916455edf2aed0c55f37e82caf3"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.157924 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" event={"ID":"3f291eed-0e60-43a7-a34a-2f7ad9788126","Type":"ContainerStarted","Data":"d3a95ff86a4b1e53a1786f457c1f95dc47516315abf7e8e981d9b94c09cadcf8"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.158646 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" event={"ID":"a5d1706b-179e-4ffd-a2af-e62d05e1e36d","Type":"ContainerStarted","Data":"7e46c614c590d753045e8acfe8f65d2c88c30fd3b2b02e315945056619a616e9"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.159228 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" event={"ID":"612ac789-5007-4e17-a81a-cf753c2acadc","Type":"ContainerStarted","Data":"f02bd535e53f3cac720e0f6f703bc17d02b580ca432a546c37b4c30530e2125a"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.159878 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" event={"ID":"9beea563-7739-4b17-b360-bb769400bdff","Type":"ContainerStarted","Data":"cb2390cf3f86e76fba0ecc4710ae35b2cd7843d291019ae931cad4cd186faf5e"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.161740 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" event={"ID":"451b0ac4-065b-46e3-813e-7b62a311e7eb","Type":"ContainerStarted","Data":"2495a6ea92b4079e7177dc04bb24c99b74599e82cb53921969909706c70e068a"} Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.176135 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cb2gw"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.184778 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.188220 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.688206291 +0000 UTC m=+208.640571756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.195491 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.224719 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8lsvm"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.246810 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.257472 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.275345 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.285509 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.285748 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.785710617 +0000 UTC m=+208.738076082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.285823 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.286570 4941 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.78655943 +0000 UTC m=+208.738924895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: W0307 06:55:11.383620 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b8ae0c_7ebc_4db9_a622_ab41e63e0690.slice/crio-bfcd7667f9530b9502135be40378e493f74b848d38d798a3036555388af75c38 WatchSource:0}: Error finding container bfcd7667f9530b9502135be40378e493f74b848d38d798a3036555388af75c38: Status 404 returned error can't find the container with id bfcd7667f9530b9502135be40378e493f74b848d38d798a3036555388af75c38 Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.386936 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.387109 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 06:55:11.887072029 +0000 UTC m=+208.839437494 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.387295 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.387746 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.887732987 +0000 UTC m=+208.840098452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.489820 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.490785 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:11.990760767 +0000 UTC m=+208.943126232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: W0307 06:55:11.498595 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45977497_a17d_40f6_bf09_22aff4a45738.slice/crio-0041559d4f61f744047a19cbbbb51f34a2b2b1dcf469ca0e37dcf011bfb4d943 WatchSource:0}: Error finding container 0041559d4f61f744047a19cbbbb51f34a2b2b1dcf469ca0e37dcf011bfb4d943: Status 404 returned error can't find the container with id 0041559d4f61f744047a19cbbbb51f34a2b2b1dcf469ca0e37dcf011bfb4d943 Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.552945 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-29stc"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.591497 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.591827 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.091812421 +0000 UTC m=+209.044177876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.628944 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5tr44"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.631070 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.678324 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.698908 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.707208 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.207175484 +0000 UTC m=+209.159540949 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.718895 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.723236 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9"] Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.803801 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.804432 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.304419342 +0000 UTC m=+209.256784807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.907683 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.907857 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.407835723 +0000 UTC m=+209.360201198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:11 crc kubenswrapper[4941]: I0307 06:55:11.908074 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:11 crc kubenswrapper[4941]: E0307 06:55:11.908451 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.40844017 +0000 UTC m=+209.360805635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.018382 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.018809 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.518781833 +0000 UTC m=+209.471147298 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.019353 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.019777 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.51977036 +0000 UTC m=+209.472135815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: W0307 06:55:12.069533 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bccd3b5_9ee1_4aa3_90e6_220c7a1a9cf8.slice/crio-f3e65b39ad050c863c0012a497b3f3828955a005ee948a966f7662b0c1480c3b WatchSource:0}: Error finding container f3e65b39ad050c863c0012a497b3f3828955a005ee948a966f7662b0c1480c3b: Status 404 returned error can't find the container with id f3e65b39ad050c863c0012a497b3f3828955a005ee948a966f7662b0c1480c3b Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.071972 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.145182 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.145615 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.645593305 +0000 UTC m=+209.597958770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.151490 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.651469538 +0000 UTC m=+209.603835003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.159266 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.204467 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m9vbd"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.242785 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" event={"ID":"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8","Type":"ContainerStarted","Data":"f3e65b39ad050c863c0012a497b3f3828955a005ee948a966f7662b0c1480c3b"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.250671 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" event={"ID":"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8","Type":"ContainerStarted","Data":"c16e67713c7b9ad72d0e651279fcdebe1a58766ce006d96e70863cedf1097708"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.254391 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" event={"ID":"3f291eed-0e60-43a7-a34a-2f7ad9788126","Type":"ContainerStarted","Data":"99f27b555a8df0f91bc9ff69a0c3d4645990103c24d99d6dc536c53bee196359"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.255921 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" event={"ID":"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690","Type":"ContainerStarted","Data":"bfcd7667f9530b9502135be40378e493f74b848d38d798a3036555388af75c38"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.257246 4941 generic.go:334] "Generic (PLEG): container finished" podID="a5d1706b-179e-4ffd-a2af-e62d05e1e36d" containerID="1201e076e115d5087e416a6182877848b80fe984d9d627d627bbaac247e50e93" exitCode=0 Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.257307 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" event={"ID":"a5d1706b-179e-4ffd-a2af-e62d05e1e36d","Type":"ContainerDied","Data":"1201e076e115d5087e416a6182877848b80fe984d9d627d627bbaac247e50e93"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.259363 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" event={"ID":"451b0ac4-065b-46e3-813e-7b62a311e7eb","Type":"ContainerStarted","Data":"b53f4b4c7acc3a07c966e3a4c62a8aac951edf2681e5839a91d6c48739fb97ae"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.260065 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.261689 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.761670828 +0000 UTC m=+209.714036293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.269799 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" event={"ID":"a7c60c91-094d-4c52-9dcb-36ad07c829ad","Type":"ContainerStarted","Data":"9f83a686094a05894d28256d3e91da4d92d1c9f19dafe308e74fe8f139e6a9bb"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.287373 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ntlhm" podStartSLOduration=175.287345893 podStartE2EDuration="2m55.287345893s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.265845034 +0000 UTC m=+209.218210519" watchObservedRunningTime="2026-03-07 06:55:12.287345893 +0000 UTC m=+209.239711358" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.290277 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.297990 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" event={"ID":"a004c43d-7acc-4a7e-afc1-947c31df55ad","Type":"ContainerStarted","Data":"9f8f879c3a9458f4a112e1f28aebee94e9c8721ff6d429257883448bc2049fae"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.298055 
4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" event={"ID":"a004c43d-7acc-4a7e-afc1-947c31df55ad","Type":"ContainerStarted","Data":"a709fd222f1027cd6eefa86bb89be1b210a5839bc1b86a870b509c5902f327ed"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.311905 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" event={"ID":"743d3e91-514c-4323-9fce-9d29d4d6e816","Type":"ContainerStarted","Data":"c0d2b5ff44d73d41a85ac2c7f685a45991084a05fab515062b0904bed7f2365c"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.313671 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" event={"ID":"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0","Type":"ContainerStarted","Data":"3bbc642d50a1ab03de09b556166f770d5bb073470284a36a0d515e47a15b2222"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.318378 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" event={"ID":"9beea563-7739-4b17-b360-bb769400bdff","Type":"ContainerStarted","Data":"5048b4f0236d86387e2e5460e22463f0d96cf0284ff1c4eba8a688199a56b6ea"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.322123 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" event={"ID":"90f01d22-70a1-4a37-b942-722fd80ed583","Type":"ContainerStarted","Data":"a375bebf333e063643d222c7e725f2ff75719a85488c0f94d6379ea8015ad6c5"} Mar 07 06:55:12 crc kubenswrapper[4941]: W0307 06:55:12.323421 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b5dbbc_b835_437d_bc58_1f4dc88e48bc.slice/crio-a5e16d8415be3264ca09a3c93d0e969f6d0117ca8d637c416687513e1d7e572b WatchSource:0}: Error finding container a5e16d8415be3264ca09a3c93d0e969f6d0117ca8d637c416687513e1d7e572b: Status 404 returned error can't find the container with id a5e16d8415be3264ca09a3c93d0e969f6d0117ca8d637c416687513e1d7e572b Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.329091 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8lsvm" event={"ID":"dc1105d7-08aa-4d81-9bb1-e162f649529d","Type":"ContainerStarted","Data":"7467cfcfe07e2aeaa05722813fc471cd9c1987c46f1f31ddc44f4a37d3b6a2ad"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.331549 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5tr44" event={"ID":"a77c6084-94de-4ebc-9a75-a83efa28b094","Type":"ContainerStarted","Data":"1b4fe7ecdb8ce0977ccd40ed7afafa28e42447a83af812e18ace16efbd6a7e60"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.332088 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7x6zc"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.347073 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8tds9"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.352327 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwzjs" event={"ID":"46da50cb-1038-4289-be6d-e5f3b4c70ab3","Type":"ContainerStarted","Data":"65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.362832 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.369522 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.869499331 +0000 UTC m=+209.821864796 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.402723 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" event={"ID":"bb8c0212-2a6d-4636-a75b-08a350f5948f","Type":"ContainerStarted","Data":"c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.403550 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.408629 4941 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-v76pm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 07 06:55:12 crc 
kubenswrapper[4941]: I0307 06:55:12.408686 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" podUID="bb8c0212-2a6d-4636-a75b-08a350f5948f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.421101 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" event={"ID":"915093d7-a9dd-4f42-8b79-c887eaf983fe","Type":"ContainerStarted","Data":"f9895a27d8fc29e3485efce0c4e4cbc5ff534e4b4dda228518b2b4c0b9ef10f8"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.421627 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.422616 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" event={"ID":"37fe4f8d-b1a8-4848-805b-f44095a2daeb","Type":"ContainerStarted","Data":"83ea547e1fefd9828f7346f9df82bcf9589ac0414c702f26e0572c56a42e9cb3"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.423810 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.431748 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" event={"ID":"f2470875-024a-4ef0-9e01-20bbbfff60bc","Type":"ContainerStarted","Data":"2be4e16ee478bd6d512f1abd2124be67fbf137858c3c1671946e1ccdea298d36"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.442072 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" event={"ID":"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb","Type":"ContainerStarted","Data":"79262c3b7c55439b088428bb281acb42da3a85dc921dfac56b12e0e350fe0b6a"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.445297 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.465454 4941 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7vzg6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.465541 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" podUID="915093d7-a9dd-4f42-8b79-c887eaf983fe" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.465655 4941 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cp52v container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:5443/healthz\": dial tcp 10.217.0.15:5443: connect: connection refused" start-of-body= Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.465677 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" podUID="37fe4f8d-b1a8-4848-805b-f44095a2daeb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.15:5443/healthz\": dial tcp 10.217.0.15:5443: connect: connection refused" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 
06:55:12.467912 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.469710 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:12.969678201 +0000 UTC m=+209.922043816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.487660 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w745p" event={"ID":"bffb6535-e060-4328-8e0d-0b8bd64c656b","Type":"ContainerStarted","Data":"9b5e557d73ec3b7f053a2b4b26df3bcf56615789df0e8c31422f7385fdfc6893"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.488205 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w745p" event={"ID":"bffb6535-e060-4328-8e0d-0b8bd64c656b","Type":"ContainerStarted","Data":"ade2e5c6786c5aebd22e25eb82c2f36993b11f291a0ee198a53776af767d1b5c"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.500795 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-vsntc"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.528281 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" event={"ID":"45977497-a17d-40f6-bf09-22aff4a45738","Type":"ContainerStarted","Data":"0041559d4f61f744047a19cbbbb51f34a2b2b1dcf469ca0e37dcf011bfb4d943"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.555017 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrc6" podStartSLOduration=175.554998257 podStartE2EDuration="2m55.554998257s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.552974481 +0000 UTC m=+209.505339956" watchObservedRunningTime="2026-03-07 06:55:12.554998257 +0000 UTC m=+209.507363722" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.560029 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-skhtt" event={"ID":"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b","Type":"ContainerStarted","Data":"3a47b38b915c6862fb020141f1e16274272340f40cba1a271e60c66a4ad29ca0"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.571578 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.573184 4941 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.073162643 +0000 UTC m=+210.025528328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.589356 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" event={"ID":"007a3578-3ca3-4979-aa06-b2ffe3c7718e","Type":"ContainerStarted","Data":"f9be58737990388306eb5de30b29ce989f5715fbba3390e6ff2a030f231aecc6"} Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.641096 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbbxx" podStartSLOduration=175.641070164 podStartE2EDuration="2m55.641070164s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.640376025 +0000 UTC m=+209.592741490" watchObservedRunningTime="2026-03-07 06:55:12.641070164 +0000 UTC m=+209.593435629" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.674008 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.675491 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.175470663 +0000 UTC m=+210.127836128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.758355 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ljm6r" podStartSLOduration=175.75832639 podStartE2EDuration="2m55.75832639s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.71021421 +0000 UTC m=+209.662579675" watchObservedRunningTime="2026-03-07 06:55:12.75832639 +0000 UTC m=+209.710691875" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.776836 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:12 crc 
kubenswrapper[4941]: E0307 06:55:12.777126 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.277115073 +0000 UTC m=+210.229480528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.779628 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.784705 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547774-2b2fh"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.836575 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn"] Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.837118 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" podStartSLOduration=174.837091124 podStartE2EDuration="2m54.837091124s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.83444273 +0000 UTC m=+209.786808205" watchObservedRunningTime="2026-03-07 06:55:12.837091124 +0000 UTC m=+209.789456589" Mar 07 06:55:12 crc 
kubenswrapper[4941]: I0307 06:55:12.881781 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.882498 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.382469628 +0000 UTC m=+210.334835093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.892422 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ljm6r" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.915655 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.927040 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" podStartSLOduration=175.927006918 podStartE2EDuration="2m55.927006918s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.877352035 +0000 UTC m=+209.829717500" watchObservedRunningTime="2026-03-07 06:55:12.927006918 +0000 UTC m=+209.879372383" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.933659 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-w745p" podStartSLOduration=175.933634263 podStartE2EDuration="2m55.933634263s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.914340455 +0000 UTC m=+209.866705920" watchObservedRunningTime="2026-03-07 06:55:12.933634263 +0000 UTC m=+209.885999728" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.948643 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.953307 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvmvs" podStartSLOduration=175.95328288 podStartE2EDuration="2m55.95328288s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.941832031 +0000 UTC m=+209.894197516" watchObservedRunningTime="2026-03-07 06:55:12.95328288 +0000 UTC m=+209.905648345" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.964363 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:12 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 
06:55:12 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:12 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.964447 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:12 crc kubenswrapper[4941]: I0307 06:55:12.983930 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:12 crc kubenswrapper[4941]: E0307 06:55:12.984439 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.484422977 +0000 UTC m=+210.436788442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.021237 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c8tfd" podStartSLOduration=176.021210782 podStartE2EDuration="2m56.021210782s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:13.020818061 +0000 UTC m=+209.973183536" watchObservedRunningTime="2026-03-07 06:55:13.021210782 +0000 UTC m=+209.973576247" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.031224 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nwzjs" podStartSLOduration=176.03119483 podStartE2EDuration="2m56.03119483s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:12.984713415 +0000 UTC m=+209.937078870" watchObservedRunningTime="2026-03-07 06:55:13.03119483 +0000 UTC m=+209.983560295" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.070794 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8qnmz" podStartSLOduration=176.070773252 podStartE2EDuration="2m56.070773252s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:13.069131617 +0000 UTC m=+210.021497082" watchObservedRunningTime="2026-03-07 06:55:13.070773252 +0000 UTC m=+210.023138717" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.121892 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.122757 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.622733859 +0000 UTC m=+210.575099324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.153047 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld"] Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.175290 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" podStartSLOduration=175.175258182 podStartE2EDuration="2m55.175258182s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:13.143435416 +0000 UTC m=+210.095800881" watchObservedRunningTime="2026-03-07 06:55:13.175258182 +0000 UTC m=+210.127623647" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.183342 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lhtqh" podStartSLOduration=176.183313117 podStartE2EDuration="2m56.183313117s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:13.1730123 +0000 UTC m=+210.125377765" watchObservedRunningTime="2026-03-07 06:55:13.183313117 +0000 UTC m=+210.135678602" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.231839 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.232223 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.732208948 +0000 UTC m=+210.684574413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.341978 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.342576 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.842554282 +0000 UTC m=+210.794919747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.444278 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.445206 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:13.94519241 +0000 UTC m=+210.897557875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.545806 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.546697 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.046671667 +0000 UTC m=+210.999037122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.619528 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" event={"ID":"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a","Type":"ContainerStarted","Data":"9b5cdd51d8c405799211250d5cc3a530f3eee72d277f5b885b7a415275776bcb"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.630732 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" event={"ID":"fb104051-0f66-4faf-95f5-a7a28d41cdc2","Type":"ContainerStarted","Data":"024a9570727ccafe97eff5785ef4fe309be6e1866c2afb7f6dcaaedc70489bc1"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.651949 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.652495 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.152472403 +0000 UTC m=+211.104838058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.672224 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" event={"ID":"fdca2db2-c710-4685-9033-fbbe40f73076","Type":"ContainerStarted","Data":"e1654ddeb4f943c9954fd346aac51e4b052eadfdb00a8bb7490e3aab97da5c80"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.700209 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" event={"ID":"a7c60c91-094d-4c52-9dcb-36ad07c829ad","Type":"ContainerStarted","Data":"27eb66b7156bf10345d4ec21da39d800dc751f51f673fc301b465238a7988290"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.725887 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" event={"ID":"5aaa48c5-b837-4d14-b2fa-dc1dd37a13f0","Type":"ContainerStarted","Data":"713da5c3496e922acc731be4f0d4e0a2fac7f1f5b3434eea51392d95aaf39ea7"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.733295 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5bq7n" podStartSLOduration=175.733268674 podStartE2EDuration="2m55.733268674s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:13.731277338 +0000 UTC m=+210.683642823" 
watchObservedRunningTime="2026-03-07 06:55:13.733268674 +0000 UTC m=+210.685634159" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.736716 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" event={"ID":"6cedfec9-da7b-4080-894f-dbc3b3dbad46","Type":"ContainerStarted","Data":"cf559860df14dc5797fe1661033a0758d0e8578fdbadbc10895bdb4f43879676"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.753972 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.755633 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" event={"ID":"b14f5741-27aa-4ff2-a2dc-69c385a07e16","Type":"ContainerStarted","Data":"975f35c63ee52df16e478b698fbb9eac008c9b3ef611544cc88c80e6d34a69e9"} Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.756585 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.256551782 +0000 UTC m=+211.208917247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.781970 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" event={"ID":"48b6255a-3390-4e84-bed2-6a28fc0c9800","Type":"ContainerStarted","Data":"1e23736f81a84a6697d29ef86b7b63ddde885d0aed2055660a39702da7b63cf1"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.782375 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.783004 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-29stc" podStartSLOduration=175.782988578 podStartE2EDuration="2m55.782988578s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:13.774625686 +0000 UTC m=+210.726991161" watchObservedRunningTime="2026-03-07 06:55:13.782988578 +0000 UTC m=+210.735354043" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.807083 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hr792" podStartSLOduration=176.807058209 podStartE2EDuration="2m56.807058209s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:13.806266477 +0000 UTC m=+210.758631942" watchObservedRunningTime="2026-03-07 06:55:13.807058209 +0000 UTC m=+210.759423674" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.822242 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" event={"ID":"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8","Type":"ContainerStarted","Data":"2a757197cd9d9e770acaee3e275f622adbd897f0756381b1b1cda68d2dab0b69"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.868703 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.873900 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.37387776 +0000 UTC m=+211.326243225 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.888872 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" event={"ID":"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465","Type":"ContainerStarted","Data":"8d608fbd317fe848a622cde29bad876e2063f4ca2c656ef5cd3025d9ea1df3ac"} Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.889893 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" podStartSLOduration=176.889869555 podStartE2EDuration="2m56.889869555s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:13.889255508 +0000 UTC m=+210.841620993" watchObservedRunningTime="2026-03-07 06:55:13.889869555 +0000 UTC m=+210.842235020" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.963469 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:13 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:13 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:13 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.963574 4941 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:13 crc kubenswrapper[4941]: I0307 06:55:13.970711 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:13 crc kubenswrapper[4941]: E0307 06:55:13.979776 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.479747358 +0000 UTC m=+211.432112903 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.070655 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" event={"ID":"0bccd3b5-9ee1-4aa3-90e6-220c7a1a9cf8","Type":"ContainerStarted","Data":"56d3e3d666b62c61db4bab35ceff190f1bfe6b343c41b624e217d6b43cbbbc02"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.070702 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" event={"ID":"f2470875-024a-4ef0-9e01-20bbbfff60bc","Type":"ContainerStarted","Data":"c1a9780de88fc06dfa563b68077907fa34122ceb9cfd7c7c3ef286d6286d5c6d"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.075534 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vsntc" event={"ID":"b6ad282e-3374-4a34-8956-252c6196274d","Type":"ContainerStarted","Data":"597baeb20bbfd136aea3d912ca05581b4cc5e532ba993201c1c240725240e682"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.080079 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.080826 4941 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.580807833 +0000 UTC m=+211.533173298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.102359 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" event={"ID":"baf7bbe6-5859-4df3-9164-a62bb2333078","Type":"ContainerStarted","Data":"23b34134dbf17c1b57890da59896a149df951633d5bb2eb3656826fd43df1f99"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.104097 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.124243 4941 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7x6zc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.124291 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" podUID="baf7bbe6-5859-4df3-9164-a62bb2333078" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 
10.217.0.37:8080: connect: connection refused" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.140213 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5tr44" event={"ID":"a77c6084-94de-4ebc-9a75-a83efa28b094","Type":"ContainerStarted","Data":"873ec81fddc92445e44308c67cb3c21ff0087fe62ec875849dbe1245a3902040"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.141099 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.163064 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.163134 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.183556 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.183725 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 06:55:14.683696299 +0000 UTC m=+211.636061764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.184159 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.188198 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.688179544 +0000 UTC m=+211.640545009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.215542 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-skhtt" event={"ID":"fe2bbec6-3204-4ad3-bbcc-bdc27060c44b","Type":"ContainerStarted","Data":"e92502b31f7fdc5b17f999ed4e85af08b1843a82e3b1e10c059ec785c84cb6bc"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.264122 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" event={"ID":"755ab720-3d7a-4d86-8613-147ac07a93dd","Type":"ContainerStarted","Data":"8e12405ef576a0fc9f8b9b3c7fdc0c02239990055304c8c28d6d2d53a591bc63"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.264214 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9vbd" event={"ID":"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17","Type":"ContainerStarted","Data":"d383cd4f8c3aa7a4e94f7670ca03640bce795cfda86c64615b77fec2469bad9a"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.303213 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" event={"ID":"612ac789-5007-4e17-a81a-cf753c2acadc","Type":"ContainerStarted","Data":"c06b09c5153c27b50b6ce6200c22c54548df3ddd58ab14869b8f22d23710cb0b"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.308719 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" 
Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.305381 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.805366957 +0000 UTC m=+211.757732422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.305318 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.309565 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.311771 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-07 06:55:14.811756085 +0000 UTC m=+211.764121550 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.336166 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8lsvm" event={"ID":"dc1105d7-08aa-4d81-9bb1-e162f649529d","Type":"ContainerStarted","Data":"e49007efb99e7c31a1b8b0b59c55564fa8e3d67d16109f303eedf97629798a77"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.357045 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" event={"ID":"45977497-a17d-40f6-bf09-22aff4a45738","Type":"ContainerStarted","Data":"7cfe3e7dadcddaac85f433932c3000e40de7a1564d6ebe2c3d3eaa3550a7c48f"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.406627 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" event={"ID":"07b5dbbc-b835-437d-bc58-1f4dc88e48bc","Type":"ContainerStarted","Data":"a5e16d8415be3264ca09a3c93d0e969f6d0117ca8d637c416687513e1d7e572b"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.410821 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" event={"ID":"7566a1ff-1f6b-4e99-903b-ff036f98c411","Type":"ContainerStarted","Data":"b454bce4a04d2624d7259e6a84ee526893d699dd826268c0d71c8547ea37d33f"} Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.415050 4941 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.417490 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:14.9174714 +0000 UTC m=+211.869836865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.424544 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vzg6" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.427851 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.473630 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-w9tck" podStartSLOduration=177.473600183 podStartE2EDuration="2m57.473600183s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.466947688 +0000 UTC m=+211.419313153" watchObservedRunningTime="2026-03-07 06:55:14.473600183 +0000 UTC m=+211.425965648" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.474624 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8lsvm" podStartSLOduration=7.474617591 podStartE2EDuration="7.474617591s" podCreationTimestamp="2026-03-07 06:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.436804188 +0000 UTC m=+211.389169653" watchObservedRunningTime="2026-03-07 06:55:14.474617591 +0000 UTC m=+211.426983066" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.517899 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.525075 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.025055336 +0000 UTC m=+211.977420881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.584286 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" podStartSLOduration=177.584263305 podStartE2EDuration="2m57.584263305s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.579689378 +0000 UTC m=+211.532054863" watchObservedRunningTime="2026-03-07 06:55:14.584263305 +0000 UTC m=+211.536628770" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.623086 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.623705 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.123686823 +0000 UTC m=+212.076052288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.632292 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-skhtt" podStartSLOduration=7.632251342 podStartE2EDuration="7.632251342s" podCreationTimestamp="2026-03-07 06:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.631639325 +0000 UTC m=+211.584004810" watchObservedRunningTime="2026-03-07 06:55:14.632251342 +0000 UTC m=+211.584616807" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.665630 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qhnm7" podStartSLOduration=177.66559787 podStartE2EDuration="2m57.66559787s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.66487922 +0000 UTC m=+211.617244685" watchObservedRunningTime="2026-03-07 06:55:14.66559787 +0000 UTC m=+211.617963335" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.707022 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5tr44" podStartSLOduration=177.706996913 podStartE2EDuration="2m57.706996913s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.702260171 +0000 UTC m=+211.654625646" watchObservedRunningTime="2026-03-07 06:55:14.706996913 +0000 UTC m=+211.659362378" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.724283 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.725531 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.225514929 +0000 UTC m=+212.177880394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.763531 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46200: no serving certificate available for the kubelet" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.809182 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-x2rhs" podStartSLOduration=177.809161169 podStartE2EDuration="2m57.809161169s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.764162716 +0000 UTC m=+211.716528191" watchObservedRunningTime="2026-03-07 06:55:14.809161169 +0000 UTC m=+211.761526634" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.825874 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.826665 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 06:55:15.326645206 +0000 UTC m=+212.279010671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.865654 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46212: no serving certificate available for the kubelet" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.874889 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" podStartSLOduration=176.874848728 podStartE2EDuration="2m56.874848728s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.813086078 +0000 UTC m=+211.765451543" watchObservedRunningTime="2026-03-07 06:55:14.874848728 +0000 UTC m=+211.827214203" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.914636 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" podStartSLOduration=176.914610926 podStartE2EDuration="2m56.914610926s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:14.851184359 +0000 UTC m=+211.803549834" watchObservedRunningTime="2026-03-07 06:55:14.914610926 +0000 UTC m=+211.866976391" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 
06:55:14.934424 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:14 crc kubenswrapper[4941]: E0307 06:55:14.934895 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.434875609 +0000 UTC m=+212.387241074 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.966137 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46222: no serving certificate available for the kubelet" Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.966729 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:14 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:14 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:14 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:14 crc kubenswrapper[4941]: I0307 06:55:14.966770 4941 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.038100 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.039397 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.53937005 +0000 UTC m=+212.491735515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.136862 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46232: no serving certificate available for the kubelet" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.142598 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.143283 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.643268413 +0000 UTC m=+212.595633878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.204981 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.240397 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46234: no serving certificate available for the kubelet" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.249773 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z4zqp"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.250859 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.252654 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.252928 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.752910837 +0000 UTC m=+212.705276302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.254523 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.275971 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4zqp"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.320659 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46238: no serving certificate available for the kubelet" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.354135 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-catalog-content\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.354240 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.354291 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-utilities\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.354332 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cs5v\" (UniqueName: \"kubernetes.io/projected/805b56ac-66fd-4704-adb1-f3968f17f835-kube-api-access-4cs5v\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.354797 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.854779914 +0000 UTC m=+212.807145379 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.414998 4941 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cp52v container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.415082 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" podUID="37fe4f8d-b1a8-4848-805b-f44095a2daeb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.15:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.425853 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qx5dk"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.427187 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.438257 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.439168 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46252: no serving certificate available for the kubelet" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.458733 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.459020 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cs5v\" (UniqueName: \"kubernetes.io/projected/805b56ac-66fd-4704-adb1-f3968f17f835-kube-api-access-4cs5v\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.459063 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-254k7\" (UniqueName: \"kubernetes.io/projected/bdb71b40-ad9b-405b-a178-158109d65a92-kube-api-access-254k7\") pod \"community-operators-qx5dk\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.459100 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-catalog-content\") pod \"community-operators-qx5dk\" 
(UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.459133 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-catalog-content\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.459263 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-utilities\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.459293 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-utilities\") pod \"community-operators-qx5dk\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.459523 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:15.959494191 +0000 UTC m=+212.911859656 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.460946 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-catalog-content\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.461181 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-utilities\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.462479 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v76pm"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.480486 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qx5dk"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.502273 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csdsw" event={"ID":"45977497-a17d-40f6-bf09-22aff4a45738","Type":"ContainerStarted","Data":"a8e9b0d64018705bc9bde554ff49907520b8ba7967529d3acaf648cd6b01bc88"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.505835 4941 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" event={"ID":"fdca2db2-c710-4685-9033-fbbe40f73076","Type":"ContainerStarted","Data":"68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.506608 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.527743 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" event={"ID":"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb","Type":"ContainerStarted","Data":"256668fde66ac50d195569767d11e7f22581a583b8cc45738819061a4e3737e2"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.530907 4941 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-zm7wn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.532223 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cs5v\" (UniqueName: \"kubernetes.io/projected/805b56ac-66fd-4704-adb1-f3968f17f835-kube-api-access-4cs5v\") pod \"certified-operators-z4zqp\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") " pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.532354 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" podUID="fdca2db2-c710-4685-9033-fbbe40f73076" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 
10.217.0.25:8443: connect: connection refused" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.554602 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" event={"ID":"a5d1706b-179e-4ffd-a2af-e62d05e1e36d","Type":"ContainerStarted","Data":"9b040850e5c6fbb14c5f4c82d3d0dca1408ac0e6a2d14f3ec4f60f1a8565e05d"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.557369 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" event={"ID":"fb104051-0f66-4faf-95f5-a7a28d41cdc2","Type":"ContainerStarted","Data":"772071dd768103d95e6bf1839fbe155826d72f1a7822993e963c63d9d4544d63"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.564372 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.564536 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-utilities\") pod \"community-operators-qx5dk\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.564621 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-254k7\" (UniqueName: \"kubernetes.io/projected/bdb71b40-ad9b-405b-a178-158109d65a92-kube-api-access-254k7\") pod \"community-operators-qx5dk\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc 
kubenswrapper[4941]: I0307 06:55:15.564650 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-catalog-content\") pod \"community-operators-qx5dk\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.566526 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.066509631 +0000 UTC m=+213.018875096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.569327 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-catalog-content\") pod \"community-operators-qx5dk\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.579471 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-utilities\") pod \"community-operators-qx5dk\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc 
kubenswrapper[4941]: I0307 06:55:15.588552 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" event={"ID":"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465","Type":"ContainerStarted","Data":"7609890d146506588b27717c83c154f828385966663b6f35a1f195f64aeb58af"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.588633 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" event={"ID":"dc64ec5b-3c7b-47de-8a6f-6c2555ba6465","Type":"ContainerStarted","Data":"e6aa9d15686f4b735c0f825cd2923dfb08ea1ab768953d5a11662fc78a082682"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.589152 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.599847 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-254k7\" (UniqueName: \"kubernetes.io/projected/bdb71b40-ad9b-405b-a178-158109d65a92-kube-api-access-254k7\") pod \"community-operators-qx5dk\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") " pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.602983 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.645629 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" podStartSLOduration=177.645605604 podStartE2EDuration="2m57.645605604s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:15.64401196 +0000 UTC m=+212.596377425" watchObservedRunningTime="2026-03-07 06:55:15.645605604 +0000 UTC m=+212.597971079" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.651040 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" event={"ID":"743d3e91-514c-4323-9fce-9d29d4d6e816","Type":"ContainerStarted","Data":"8f2f85c0eb03a36ccdee37557f46b2eb0d01076a32f3012eadf10a9e986830d0"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.669150 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.670655 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.170621441 +0000 UTC m=+213.122987086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.675617 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" event={"ID":"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690","Type":"ContainerStarted","Data":"35c44be5debdae4f1159bc2ef0d97633cd8fdfaf3a9bff3031091076365a9043"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.675703 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" event={"ID":"b0b8ae0c-7ebc-4db9-a622-ab41e63e0690","Type":"ContainerStarted","Data":"a205949b00c661b6e654413b846bace09446b9f2e5a754679c0d8e73eb2b6cb6"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.692969 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9vbd" event={"ID":"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17","Type":"ContainerStarted","Data":"e2c96dfe0d93502a7684c1b50420cedf10a02e0c58f4a9fd0f6d9ffce645df88"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.698029 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jg549"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.699318 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.738043 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46258: no serving certificate available for the kubelet" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.743756 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" event={"ID":"007a3578-3ca3-4979-aa06-b2ffe3c7718e","Type":"ContainerStarted","Data":"ebc74596429ac7e755277fba8452969349264c77b62507d1e07265c8b20859f7"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.744664 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.757559 4941 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-75pn8 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.757631 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" podUID="007a3578-3ca3-4979-aa06-b2ffe3c7718e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.766841 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8sj2" podStartSLOduration=178.76681616 podStartE2EDuration="2m58.76681616s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-07 06:55:15.720902951 +0000 UTC m=+212.673268426" watchObservedRunningTime="2026-03-07 06:55:15.76681616 +0000 UTC m=+212.719181635" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.769949 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jg549"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.772309 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-catalog-content\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.772462 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-utilities\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.772505 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.772558 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4pcp\" (UniqueName: \"kubernetes.io/projected/86719fee-4b62-4f53-958e-9e87f56a9062-kube-api-access-x4pcp\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " 
pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.774352 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.274335849 +0000 UTC m=+213.226701314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.783911 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" podStartSLOduration=178.783885445 podStartE2EDuration="2m58.783885445s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:15.782110806 +0000 UTC m=+212.734476271" watchObservedRunningTime="2026-03-07 06:55:15.783885445 +0000 UTC m=+212.736250910" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.794306 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" event={"ID":"9beea563-7739-4b17-b360-bb769400bdff","Type":"ContainerStarted","Data":"175c08a287fe022e0fcf4ffe0c0ff5b6c83f4cfe62c08192bb34a50f7b6889a4"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.794881 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.831575 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.844359 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" podStartSLOduration=177.844335359 podStartE2EDuration="2m57.844335359s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:15.842328833 +0000 UTC m=+212.794694298" watchObservedRunningTime="2026-03-07 06:55:15.844335359 +0000 UTC m=+212.796700824" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.859719 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" event={"ID":"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a","Type":"ContainerStarted","Data":"8934c67577997a9e360c835ef46642be590295552683c5de23ce34e22c763ab5"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.859793 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" event={"ID":"0b95d3f5-cb4a-4b38-9b99-bf07e48b0b1a","Type":"ContainerStarted","Data":"7def69072996663a0684261ff241c85103b87db6e8b13bda7585f35b62b842d7"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.874075 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:15 crc 
kubenswrapper[4941]: I0307 06:55:15.874453 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-utilities\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.874537 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4pcp\" (UniqueName: \"kubernetes.io/projected/86719fee-4b62-4f53-958e-9e87f56a9062-kube-api-access-x4pcp\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.874623 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-catalog-content\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.875172 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-catalog-content\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.875266 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.37524411 +0000 UTC m=+213.327609575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.876294 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-utilities\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.876475 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xsszj"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.905187 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.938880 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" event={"ID":"baf7bbe6-5859-4df3-9164-a62bb2333078","Type":"ContainerStarted","Data":"00dd23d15607af828fd94937288ed330bed79fbe736af6984249875b6aa4eb04"} Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.947912 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4pcp\" (UniqueName: \"kubernetes.io/projected/86719fee-4b62-4f53-958e-9e87f56a9062-kube-api-access-x4pcp\") pod \"certified-operators-jg549\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.950395 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsszj"] Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.962773 4941 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7x6zc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.963135 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" podUID="baf7bbe6-5859-4df3-9164-a62bb2333078" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.972127 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:15 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:15 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:15 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.972213 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.976567 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-utilities\") pod \"community-operators-xsszj\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.976652 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qhtv\" (UniqueName: \"kubernetes.io/projected/715e8d60-13c8-442f-bec0-2f2fd1cfe172-kube-api-access-8qhtv\") pod \"community-operators-xsszj\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.976694 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:15 crc kubenswrapper[4941]: I0307 06:55:15.976727 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-catalog-content\") pod \"community-operators-xsszj\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:15 crc kubenswrapper[4941]: E0307 06:55:15.978716 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.478694151 +0000 UTC m=+213.431059616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.022637 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-cb2gw" podStartSLOduration=178.022615634 podStartE2EDuration="2m58.022615634s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:16.022376518 +0000 UTC m=+212.974741983" watchObservedRunningTime="2026-03-07 06:55:16.022615634 +0000 UTC m=+212.974981099" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.032635 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" 
event={"ID":"07b5dbbc-b835-437d-bc58-1f4dc88e48bc","Type":"ContainerStarted","Data":"553e164cd79be5a2b18f54b534bfbb1a78b2aed491f90c847c551ae0ef720f3e"} Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.032670 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" event={"ID":"07b5dbbc-b835-437d-bc58-1f4dc88e48bc","Type":"ContainerStarted","Data":"6116abbd85c7fad9c245fba7b06687981e092ddc543c0d4cceec980cb2db7fa0"} Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.032688 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" event={"ID":"6cedfec9-da7b-4080-894f-dbc3b3dbad46","Type":"ContainerStarted","Data":"9e0581fdfdfca37e11fdce82f1b1d91183307ec1efda08f1cf00af97b207e32f"} Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.032699 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" event={"ID":"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8","Type":"ContainerDied","Data":"2a757197cd9d9e770acaee3e275f622adbd897f0756381b1b1cda68d2dab0b69"} Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.027049 4941 generic.go:334] "Generic (PLEG): container finished" podID="1f1180d1-6136-4fa5-9c10-7c4a25d4dff8" containerID="2a757197cd9d9e770acaee3e275f622adbd897f0756381b1b1cda68d2dab0b69" exitCode=0 Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.034583 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" event={"ID":"1f1180d1-6136-4fa5-9c10-7c4a25d4dff8","Type":"ContainerStarted","Data":"baf3f0cea3f5ed10baefd2a8f8286788ee3fd80cf864d6d4321638c3b7d36887"} Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.035657 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": 
dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.035707 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.084500 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.084781 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.085256 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.585076024 +0000 UTC m=+213.537441499 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.085334 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-utilities\") pod \"community-operators-xsszj\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.085553 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qhtv\" (UniqueName: \"kubernetes.io/projected/715e8d60-13c8-442f-bec0-2f2fd1cfe172-kube-api-access-8qhtv\") pod \"community-operators-xsszj\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.085612 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.085689 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-catalog-content\") pod \"community-operators-xsszj\" (UID: 
\"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.086534 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" podStartSLOduration=178.086518124 podStartE2EDuration="2m58.086518124s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:16.084299242 +0000 UTC m=+213.036664717" watchObservedRunningTime="2026-03-07 06:55:16.086518124 +0000 UTC m=+213.038883589" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.087658 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ccz2v" podStartSLOduration=179.087647846 podStartE2EDuration="2m59.087647846s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:16.054802531 +0000 UTC m=+213.007168006" watchObservedRunningTime="2026-03-07 06:55:16.087647846 +0000 UTC m=+213.040013311" Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.090578 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.590552026 +0000 UTC m=+213.542917491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.100480 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-catalog-content\") pod \"community-operators-xsszj\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.103872 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-utilities\") pod \"community-operators-xsszj\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.185775 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qhtv\" (UniqueName: \"kubernetes.io/projected/715e8d60-13c8-442f-bec0-2f2fd1cfe172-kube-api-access-8qhtv\") pod \"community-operators-xsszj\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.201014 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.209296 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.709266223 +0000 UTC m=+213.661631688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.211337 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2kb4" podStartSLOduration=179.21131907 podStartE2EDuration="2m59.21131907s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:16.170282027 +0000 UTC m=+213.122647492" watchObservedRunningTime="2026-03-07 06:55:16.21131907 +0000 UTC m=+213.163684535" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.241587 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rxqwq" podStartSLOduration=178.241553942 podStartE2EDuration="2m58.241553942s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 
06:55:16.234488475 +0000 UTC m=+213.186854010" watchObservedRunningTime="2026-03-07 06:55:16.241553942 +0000 UTC m=+213.193919407" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.293565 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8tds9" podStartSLOduration=178.29354131 podStartE2EDuration="2m58.29354131s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:16.291048401 +0000 UTC m=+213.243413886" watchObservedRunningTime="2026-03-07 06:55:16.29354131 +0000 UTC m=+213.245906785" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.309242 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6959d" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.309830 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.310286 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.810271106 +0000 UTC m=+213.762636571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.321625 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.414761 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.415170 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:16.915150977 +0000 UTC m=+213.867516442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.462003 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4zqp"] Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.512842 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46270: no serving certificate available for the kubelet" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.515738 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.516117 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.016102049 +0000 UTC m=+213.968467514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.540387 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" podStartSLOduration=178.540358214 podStartE2EDuration="2m58.540358214s" podCreationTimestamp="2026-03-07 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:16.539788818 +0000 UTC m=+213.492154283" watchObservedRunningTime="2026-03-07 06:55:16.540358214 +0000 UTC m=+213.492723679" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.542430 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mg5ld" podStartSLOduration=179.542417252 podStartE2EDuration="2m59.542417252s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:16.478786239 +0000 UTC m=+213.431151724" watchObservedRunningTime="2026-03-07 06:55:16.542417252 +0000 UTC m=+213.494782717" Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.620466 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.621095 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.121066722 +0000 UTC m=+214.073432187 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.723125 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.723553 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.223540746 +0000 UTC m=+214.175906211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.749197 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qx5dk"] Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.823790 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.824151 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.324129008 +0000 UTC m=+214.276494493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.924920 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:16 crc kubenswrapper[4941]: E0307 06:55:16.925385 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.425373478 +0000 UTC m=+214.377738943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.949862 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:16 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:16 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:16 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:16 crc kubenswrapper[4941]: I0307 06:55:16.949922 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.028663 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.030050 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 06:55:17.530030532 +0000 UTC m=+214.482395987 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.056906 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" event={"ID":"a5d1706b-179e-4ffd-a2af-e62d05e1e36d","Type":"ContainerStarted","Data":"b19fb273733c51c50adc8160df4e515719904098e066cf7f6b2d1f9e92870d39"} Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.064755 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9vbd" event={"ID":"25f8012e-e8a6-4bb9-b68c-b0b3d8a81d17","Type":"ContainerStarted","Data":"a1ed16c445e64ddf9be5412dfbbaa8a20af2b963c406be49e70e77428faf33d4"} Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.066569 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.069704 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx5dk" event={"ID":"bdb71b40-ad9b-405b-a178-158109d65a92","Type":"ContainerStarted","Data":"6c1ec96b1642fa62a471bb9082b309cacc8b442db698be53b878d706c1ad8d95"} Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.095978 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jg549"] Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.099269 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-vsntc" event={"ID":"b6ad282e-3374-4a34-8956-252c6196274d","Type":"ContainerStarted","Data":"b51e52d624b0c2a78a34d43618a43c3d5f6b8ac2ba041b2bd42af9e2838b6efa"} Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.096719 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" podStartSLOduration=180.096691339 podStartE2EDuration="3m0.096691339s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:17.091513415 +0000 UTC m=+214.043878900" watchObservedRunningTime="2026-03-07 06:55:17.096691339 +0000 UTC m=+214.049056794" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.120944 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" podUID="bb8c0212-2a6d-4636-a75b-08a350f5948f" containerName="controller-manager" containerID="cri-o://c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26" gracePeriod=30 Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.121462 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4zqp" event={"ID":"805b56ac-66fd-4704-adb1-f3968f17f835","Type":"ContainerStarted","Data":"d781c5e24e57e991e7373b372f99c3036fbf6041cf926262dcdddba61dae5240"} Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.124095 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.130789 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" 
podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.128763 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-75pn8" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.124218 4941 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7x6zc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.131142 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" podUID="baf7bbe6-5859-4df3-9164-a62bb2333078" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.137171 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.151574 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m9vbd" podStartSLOduration=10.151556667 podStartE2EDuration="10.151556667s" podCreationTimestamp="2026-03-07 06:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-07 06:55:17.143797671 +0000 UTC m=+214.096163136" watchObservedRunningTime="2026-03-07 06:55:17.151556667 +0000 UTC m=+214.103922132" Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.158513 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.65849051 +0000 UTC m=+214.610855975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.168631 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.180372 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsszj"] Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.239575 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.240391 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.740366821 +0000 UTC m=+214.692732286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.345605 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.346504 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.846486976 +0000 UTC m=+214.798852441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.445692 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5v5x2"] Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.447038 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.447334 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.447760 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.947726336 +0000 UTC m=+214.900091801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.447869 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.448614 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:17.94860571 +0000 UTC m=+214.900971175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.468161 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.495060 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v5x2"] Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.550588 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.550928 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-catalog-content\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.551051 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-utilities\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " 
pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.551103 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mz8n\" (UniqueName: \"kubernetes.io/projected/a10e3708-a476-4698-aa8d-ba99a795524a-kube-api-access-4mz8n\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.551642 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.051620509 +0000 UTC m=+215.003985964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.653284 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-catalog-content\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.653492 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.653574 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-utilities\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.653647 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mz8n\" (UniqueName: \"kubernetes.io/projected/a10e3708-a476-4698-aa8d-ba99a795524a-kube-api-access-4mz8n\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.654743 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-catalog-content\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.654864 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.154840344 +0000 UTC m=+215.107206009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.655092 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-utilities\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.707112 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mz8n\" (UniqueName: \"kubernetes.io/projected/a10e3708-a476-4698-aa8d-ba99a795524a-kube-api-access-4mz8n\") pod \"redhat-marketplace-5v5x2\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") " pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.754609 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.754976 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 06:55:18.254951853 +0000 UTC m=+215.207317318 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.779871 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.841096 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxk7"] Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.842479 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.856617 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qw9x\" (UniqueName: \"kubernetes.io/projected/37622fc0-c5dc-4e0d-848a-214bce293f7f-kube-api-access-7qw9x\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.856736 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-catalog-content\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.856937 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.857390 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.357367945 +0000 UTC m=+215.309733410 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.857622 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-utilities\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.882386 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxk7"] Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.897107 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46278: no serving certificate available for the kubelet" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.948478 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:17 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:17 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:17 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.948549 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.960103 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.960560 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.460539438 +0000 UTC m=+215.412904913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.960629 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.960661 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.960707 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-utilities\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.960767 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qw9x\" (UniqueName: \"kubernetes.io/projected/37622fc0-c5dc-4e0d-848a-214bce293f7f-kube-api-access-7qw9x\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.960790 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-catalog-content\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.960837 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.962076 4941 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-utilities\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: E0307 06:55:17.962361 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.462345889 +0000 UTC m=+215.414711354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.964532 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-catalog-content\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.964677 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.973466 4941 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:55:17 crc kubenswrapper[4941]: I0307 06:55:17.990878 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qw9x\" (UniqueName: \"kubernetes.io/projected/37622fc0-c5dc-4e0d-848a-214bce293f7f-kube-api-access-7qw9x\") pod \"redhat-marketplace-vtxk7\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.066723 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.067031 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.067079 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 
06:55:18.067168 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.067747 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.567723864 +0000 UTC m=+215.520089319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.083958 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.090330 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.090845 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80030e60-caa3-4aad-8b00-10f5143d9243-metrics-certs\") pod \"network-metrics-daemon-q9fpr\" (UID: \"80030e60-caa3-4aad-8b00-10f5143d9243\") " pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.180536 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q9fpr" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.181353 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.182852 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.184053 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.184674 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.684657821 +0000 UTC m=+215.637023286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.192152 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.251497 4941 generic.go:334] "Generic (PLEG): container finished" podID="5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb" containerID="256668fde66ac50d195569767d11e7f22581a583b8cc45738819061a4e3737e2" exitCode=0 Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.252477 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" event={"ID":"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb","Type":"ContainerDied","Data":"256668fde66ac50d195569767d11e7f22581a583b8cc45738819061a4e3737e2"} Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.257109 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.257668 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8c0212-2a6d-4636-a75b-08a350f5948f" containerName="controller-manager" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.257698 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8c0212-2a6d-4636-a75b-08a350f5948f" containerName="controller-manager" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.257936 4941 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bb8c0212-2a6d-4636-a75b-08a350f5948f" containerName="controller-manager" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.258736 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.262166 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.262468 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.273756 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.276492 4941 generic.go:334] "Generic (PLEG): container finished" podID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerID="83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286" exitCode=0 Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.276610 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsszj" event={"ID":"715e8d60-13c8-442f-bec0-2f2fd1cfe172","Type":"ContainerDied","Data":"83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286"} Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.276644 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsszj" event={"ID":"715e8d60-13c8-442f-bec0-2f2fd1cfe172","Type":"ContainerStarted","Data":"35bce658649352d6294ec01b46c93c6ca0f9142a68d0b106ef69d00733c03f14"} Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.283083 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.296844 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.298454 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm4jh\" (UniqueName: \"kubernetes.io/projected/bb8c0212-2a6d-4636-a75b-08a350f5948f-kube-api-access-cm4jh\") pod \"bb8c0212-2a6d-4636-a75b-08a350f5948f\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.298676 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-config\") pod \"bb8c0212-2a6d-4636-a75b-08a350f5948f\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.298800 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-proxy-ca-bundles\") pod \"bb8c0212-2a6d-4636-a75b-08a350f5948f\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.298997 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8c0212-2a6d-4636-a75b-08a350f5948f-serving-cert\") pod \"bb8c0212-2a6d-4636-a75b-08a350f5948f\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.299137 4941 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-client-ca\") pod \"bb8c0212-2a6d-4636-a75b-08a350f5948f\" (UID: \"bb8c0212-2a6d-4636-a75b-08a350f5948f\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.300088 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ddba3f-71ed-49d2-a174-2d59cc716962-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"70ddba3f-71ed-49d2-a174-2d59cc716962\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.301616 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ddba3f-71ed-49d2-a174-2d59cc716962-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"70ddba3f-71ed-49d2-a174-2d59cc716962\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.306279 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.806230056 +0000 UTC m=+215.758595521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.306962 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-client-ca" (OuterVolumeSpecName: "client-ca") pod "bb8c0212-2a6d-4636-a75b-08a350f5948f" (UID: "bb8c0212-2a6d-4636-a75b-08a350f5948f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.310562 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bb8c0212-2a6d-4636-a75b-08a350f5948f" (UID: "bb8c0212-2a6d-4636-a75b-08a350f5948f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.317055 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-config" (OuterVolumeSpecName: "config") pod "bb8c0212-2a6d-4636-a75b-08a350f5948f" (UID: "bb8c0212-2a6d-4636-a75b-08a350f5948f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.328077 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.337378 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8c0212-2a6d-4636-a75b-08a350f5948f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bb8c0212-2a6d-4636-a75b-08a350f5948f" (UID: "bb8c0212-2a6d-4636-a75b-08a350f5948f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.354050 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8c0212-2a6d-4636-a75b-08a350f5948f-kube-api-access-cm4jh" (OuterVolumeSpecName: "kube-api-access-cm4jh") pod "bb8c0212-2a6d-4636-a75b-08a350f5948f" (UID: "bb8c0212-2a6d-4636-a75b-08a350f5948f"). InnerVolumeSpecName "kube-api-access-cm4jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.354428 4941 generic.go:334] "Generic (PLEG): container finished" podID="86719fee-4b62-4f53-958e-9e87f56a9062" containerID="6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c" exitCode=0 Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.355809 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jg549" event={"ID":"86719fee-4b62-4f53-958e-9e87f56a9062","Type":"ContainerDied","Data":"6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c"} Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.355860 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jg549" event={"ID":"86719fee-4b62-4f53-958e-9e87f56a9062","Type":"ContainerStarted","Data":"ec4f39fc91b5ef4a17e019bb59dc79248af52881668f6be434bea4bb8b658d76"} Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.367733 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66dd75d944-rw77d"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.372635 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.375370 4941 generic.go:334] "Generic (PLEG): container finished" podID="bdb71b40-ad9b-405b-a178-158109d65a92" containerID="ad2c70584d1e673c0a880b6d03c0354c9189453946e60a5dfb6e744f65c96d77" exitCode=0 Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.375481 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx5dk" event={"ID":"bdb71b40-ad9b-405b-a178-158109d65a92","Type":"ContainerDied","Data":"ad2c70584d1e673c0a880b6d03c0354c9189453946e60a5dfb6e744f65c96d77"} Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.376359 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66dd75d944-rw77d"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.385627 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v5x2"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.387709 4941 generic.go:334] "Generic (PLEG): container finished" podID="bb8c0212-2a6d-4636-a75b-08a350f5948f" containerID="c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26" exitCode=0 Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.387797 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" event={"ID":"bb8c0212-2a6d-4636-a75b-08a350f5948f","Type":"ContainerDied","Data":"c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26"} Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.387830 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" event={"ID":"bb8c0212-2a6d-4636-a75b-08a350f5948f","Type":"ContainerDied","Data":"0fa92f36965ccbe86d8063d51146e42efa08635dd7940af9b0e68971e9995679"} Mar 07 06:55:18 crc 
kubenswrapper[4941]: I0307 06:55:18.387851 4941 scope.go:117] "RemoveContainer" containerID="c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.388058 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v76pm" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.399487 4941 generic.go:334] "Generic (PLEG): container finished" podID="805b56ac-66fd-4704-adb1-f3968f17f835" containerID="9c04ce9ef1572b01e35c46419ecc2059217c92a09940fe2cae1a259cb41f8ccc" exitCode=0 Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.400950 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4zqp" event={"ID":"805b56ac-66fd-4704-adb1-f3968f17f835","Type":"ContainerDied","Data":"9c04ce9ef1572b01e35c46419ecc2059217c92a09940fe2cae1a259cb41f8ccc"} Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.403521 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" podUID="fdca2db2-c710-4685-9033-fbbe40f73076" containerName="route-controller-manager" containerID="cri-o://68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64" gracePeriod=30 Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.404345 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.405395 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/70ddba3f-71ed-49d2-a174-2d59cc716962-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"70ddba3f-71ed-49d2-a174-2d59cc716962\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.405676 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ddba3f-71ed-49d2-a174-2d59cc716962-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"70ddba3f-71ed-49d2-a174-2d59cc716962\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.408062 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8c0212-2a6d-4636-a75b-08a350f5948f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.408130 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ddba3f-71ed-49d2-a174-2d59cc716962-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"70ddba3f-71ed-49d2-a174-2d59cc716962\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.408433 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:18.908418573 +0000 UTC m=+215.860784038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.408477 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.408731 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm4jh\" (UniqueName: \"kubernetes.io/projected/bb8c0212-2a6d-4636-a75b-08a350f5948f-kube-api-access-cm4jh\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.408748 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.408759 4941 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb8c0212-2a6d-4636-a75b-08a350f5948f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.444449 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.455977 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ddba3f-71ed-49d2-a174-2d59cc716962-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"70ddba3f-71ed-49d2-a174-2d59cc716962\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.460313 4941 scope.go:117] "RemoveContainer" containerID="c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.461198 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26\": container with ID starting with c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26 not found: ID does not exist" containerID="c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.461255 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26"} err="failed to get container status \"c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26\": rpc error: code = NotFound desc = could not find container \"c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26\": container with ID starting with c5b7d30f1a7540962614f7c1e8e985aadca3abca5d22e20cecf3923be1c81d26 not found: ID does not exist" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.484850 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ktp7f"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.491038 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ktp7f"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.491187 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.493439 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.510759 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.513173 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.013134948 +0000 UTC m=+215.965500413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.515532 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-client-ca\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.515713 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-config\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.515832 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.515957 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6hx\" (UniqueName: 
\"kubernetes.io/projected/e166d976-6595-48f1-be9a-7c5b64567366-kube-api-access-vq6hx\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.516215 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-proxy-ca-bundles\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.516311 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e166d976-6595-48f1-be9a-7c5b64567366-serving-cert\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.522273 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.022254752 +0000 UTC m=+215.974620217 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.611425 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.617128 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.619350 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.117378591 +0000 UTC m=+216.069744056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619397 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq6hx\" (UniqueName: \"kubernetes.io/projected/e166d976-6595-48f1-be9a-7c5b64567366-kube-api-access-vq6hx\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619473 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-catalog-content\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619547 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-proxy-ca-bundles\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619669 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e166d976-6595-48f1-be9a-7c5b64567366-serving-cert\") pod 
\"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619702 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-client-ca\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619734 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-config\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619757 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5lmp\" (UniqueName: \"kubernetes.io/projected/d003569d-8946-47e7-adf2-5148ca8de944-kube-api-access-r5lmp\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619828 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.619851 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-utilities\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.624203 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.124176791 +0000 UTC m=+216.076542456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.628951 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-client-ca\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.644649 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e166d976-6595-48f1-be9a-7c5b64567366-serving-cert\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.645022 4941 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-proxy-ca-bundles\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.646827 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-config\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.669902 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq6hx\" (UniqueName: \"kubernetes.io/projected/e166d976-6595-48f1-be9a-7c5b64567366-kube-api-access-vq6hx\") pod \"controller-manager-66dd75d944-rw77d\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.680703 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v76pm"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.685136 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v76pm"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.716969 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.722287 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.722653 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-catalog-content\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.722748 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5lmp\" (UniqueName: \"kubernetes.io/projected/d003569d-8946-47e7-adf2-5148ca8de944-kube-api-access-r5lmp\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.722785 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-utilities\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.723390 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-utilities\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " 
pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.723513 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.223471676 +0000 UTC m=+216.175837131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.724549 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-catalog-content\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.774824 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5lmp\" (UniqueName: \"kubernetes.io/projected/d003569d-8946-47e7-adf2-5148ca8de944-kube-api-access-r5lmp\") pod \"redhat-operators-ktp7f\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") " pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.815909 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g4dmh"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.822337 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.841090 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.841453 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g4dmh"] Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.841537 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-catalog-content\") pod \"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.841623 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.841774 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k5mh\" (UniqueName: \"kubernetes.io/projected/36212ca9-755e-4104-a203-7c136afbfca9-kube-api-access-2k5mh\") pod \"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.841875 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-utilities\") pod 
\"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.842130 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.342111451 +0000 UTC m=+216.294477116 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.947934 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.948256 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-utilities\") pod \"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.948383 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-catalog-content\") pod 
\"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.948468 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k5mh\" (UniqueName: \"kubernetes.io/projected/36212ca9-755e-4104-a203-7c136afbfca9-kube-api-access-2k5mh\") pod \"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: E0307 06:55:18.949006 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.448986247 +0000 UTC m=+216.401351712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.950085 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-catalog-content\") pod \"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.950205 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-utilities\") 
pod \"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.953639 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:18 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:18 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:18 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:18 crc kubenswrapper[4941]: I0307 06:55:18.953703 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:18.990550 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k5mh\" (UniqueName: \"kubernetes.io/projected/36212ca9-755e-4104-a203-7c136afbfca9-kube-api-access-2k5mh\") pod \"redhat-operators-g4dmh\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") " pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.012823 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q9fpr"] Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.055488 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxk7"] Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.059953 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.060353 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.560336368 +0000 UTC m=+216.512701833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: W0307 06:55:19.113158 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80030e60_caa3_4aad_8b00_10f5143d9243.slice/crio-680d4ce13aa5035ff1938755867484bf821963f9211a361b3cdbe03bda431d7b WatchSource:0}: Error finding container 680d4ce13aa5035ff1938755867484bf821963f9211a361b3cdbe03bda431d7b: Status 404 returned error can't find the container with id 680d4ce13aa5035ff1938755867484bf821963f9211a361b3cdbe03bda431d7b Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.172626 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.179087 4941 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.679040144 +0000 UTC m=+216.631405609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.184133 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.276079 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.276588 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.77655304 +0000 UTC m=+216.728918505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.357066 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.375885 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.378153 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.378942 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.878914331 +0000 UTC m=+216.831279786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: W0307 06:55:19.420909 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod70ddba3f_71ed_49d2_a174_2d59cc716962.slice/crio-5fd3ce026edc781927867ffafe8b35a3d2da051b89d22ca930d2badf67c41145 WatchSource:0}: Error finding container 5fd3ce026edc781927867ffafe8b35a3d2da051b89d22ca930d2badf67c41145: Status 404 returned error can't find the container with id 5fd3ce026edc781927867ffafe8b35a3d2da051b89d22ca930d2badf67c41145 Mar 07 06:55:19 crc kubenswrapper[4941]: W0307 06:55:19.431194 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-3ca5de8efa03b3268fdf1fa915ae27d9a0e6f074b2b2f1e6b7fdcab8f20a3a1c WatchSource:0}: Error finding container 3ca5de8efa03b3268fdf1fa915ae27d9a0e6f074b2b2f1e6b7fdcab8f20a3a1c: Status 404 returned error can't find the container with id 3ca5de8efa03b3268fdf1fa915ae27d9a0e6f074b2b2f1e6b7fdcab8f20a3a1c Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.435727 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3be4062412c296d395ebc82c0885a67477221167f19280d0b43b7a1b8c1baf34"} Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.438772 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vtxk7" event={"ID":"37622fc0-c5dc-4e0d-848a-214bce293f7f","Type":"ContainerStarted","Data":"7d9a7372a18ca6df94bb77a20939c88d162938d62478d6229c493fa790ffbc7c"} Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.444698 4941 generic.go:334] "Generic (PLEG): container finished" podID="fdca2db2-c710-4685-9033-fbbe40f73076" containerID="68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64" exitCode=0 Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.444789 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" event={"ID":"fdca2db2-c710-4685-9033-fbbe40f73076","Type":"ContainerDied","Data":"68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64"} Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.444843 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" event={"ID":"fdca2db2-c710-4685-9033-fbbe40f73076","Type":"ContainerDied","Data":"e1654ddeb4f943c9954fd346aac51e4b052eadfdb00a8bb7490e3aab97da5c80"} Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.444863 4941 scope.go:117] "RemoveContainer" containerID="68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.445005 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.471679 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" event={"ID":"80030e60-caa3-4aad-8b00-10f5143d9243","Type":"ContainerStarted","Data":"680d4ce13aa5035ff1938755867484bf821963f9211a361b3cdbe03bda431d7b"} Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.474553 4941 generic.go:334] "Generic (PLEG): container finished" podID="a10e3708-a476-4698-aa8d-ba99a795524a" containerID="756e5b0af3038eb53f3c1b907a19edb61aa6b2a46afc769da1b7fa1daeb32ddf" exitCode=0 Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.474639 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v5x2" event={"ID":"a10e3708-a476-4698-aa8d-ba99a795524a","Type":"ContainerDied","Data":"756e5b0af3038eb53f3c1b907a19edb61aa6b2a46afc769da1b7fa1daeb32ddf"} Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.474717 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v5x2" event={"ID":"a10e3708-a476-4698-aa8d-ba99a795524a","Type":"ContainerStarted","Data":"2c19b6640fd50716e9f0f5c7f5094ff34bad0f1bf25bfc08da9aa9fec332d040"} Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.479104 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdca2db2-c710-4685-9033-fbbe40f73076-serving-cert\") pod \"fdca2db2-c710-4685-9033-fbbe40f73076\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.479153 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-config\") pod \"fdca2db2-c710-4685-9033-fbbe40f73076\" (UID: 
\"fdca2db2-c710-4685-9033-fbbe40f73076\") " Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.479345 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-client-ca\") pod \"fdca2db2-c710-4685-9033-fbbe40f73076\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.479450 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmqb5\" (UniqueName: \"kubernetes.io/projected/fdca2db2-c710-4685-9033-fbbe40f73076-kube-api-access-pmqb5\") pod \"fdca2db2-c710-4685-9033-fbbe40f73076\" (UID: \"fdca2db2-c710-4685-9033-fbbe40f73076\") " Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.479670 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.480130 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:19.98011483 +0000 UTC m=+216.932480295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.480715 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-client-ca" (OuterVolumeSpecName: "client-ca") pod "fdca2db2-c710-4685-9033-fbbe40f73076" (UID: "fdca2db2-c710-4685-9033-fbbe40f73076"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.481561 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-config" (OuterVolumeSpecName: "config") pod "fdca2db2-c710-4685-9033-fbbe40f73076" (UID: "fdca2db2-c710-4685-9033-fbbe40f73076"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.495349 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdca2db2-c710-4685-9033-fbbe40f73076-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fdca2db2-c710-4685-9033-fbbe40f73076" (UID: "fdca2db2-c710-4685-9033-fbbe40f73076"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.502585 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdca2db2-c710-4685-9033-fbbe40f73076-kube-api-access-pmqb5" (OuterVolumeSpecName: "kube-api-access-pmqb5") pod "fdca2db2-c710-4685-9033-fbbe40f73076" (UID: "fdca2db2-c710-4685-9033-fbbe40f73076"). InnerVolumeSpecName "kube-api-access-pmqb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:55:19 crc kubenswrapper[4941]: W0307 06:55:19.514947 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-8366db1c0b141e7ad5c6c837e0a5b9df30cc6c918b822b481fffb548d9a2cba3 WatchSource:0}: Error finding container 8366db1c0b141e7ad5c6c837e0a5b9df30cc6c918b822b481fffb548d9a2cba3: Status 404 returned error can't find the container with id 8366db1c0b141e7ad5c6c837e0a5b9df30cc6c918b822b481fffb548d9a2cba3 Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.555159 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66dd75d944-rw77d"] Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.584385 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.584848 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmqb5\" (UniqueName: \"kubernetes.io/projected/fdca2db2-c710-4685-9033-fbbe40f73076-kube-api-access-pmqb5\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.584867 4941 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdca2db2-c710-4685-9033-fbbe40f73076-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.584880 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.584891 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdca2db2-c710-4685-9033-fbbe40f73076-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.585821 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.085801793 +0000 UTC m=+217.038167258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.638476 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ktp7f"] Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.652250 4941 scope.go:117] "RemoveContainer" containerID="68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64" Mar 07 06:55:19 crc kubenswrapper[4941]: W0307 06:55:19.655681 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode166d976_6595_48f1_be9a_7c5b64567366.slice/crio-71891a3fb8c033074983e3a0147fa29d34032404617d023a61be4e4e0afff3ec WatchSource:0}: Error finding container 71891a3fb8c033074983e3a0147fa29d34032404617d023a61be4e4e0afff3ec: Status 404 returned error can't find the container with id 71891a3fb8c033074983e3a0147fa29d34032404617d023a61be4e4e0afff3ec Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.686699 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.687091 4941 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.187077054 +0000 UTC m=+217.139442519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.708465 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64\": container with ID starting with 68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64 not found: ID does not exist" containerID="68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.708519 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64"} err="failed to get container status \"68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64\": rpc error: code = NotFound desc = could not find container \"68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64\": container with ID starting with 68354b85f01ee6b9096807eb08a8a31362a8f7e86595f25e895280aceaea3f64 not found: ID does not exist" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.749322 4941 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" 
path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.787845 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.788033 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.288007655 +0000 UTC m=+217.240373120 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.788309 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.788847 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.288838578 +0000 UTC m=+217.241204043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.829390 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn"] Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.833333 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zm7wn"] Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.856227 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.889547 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-secret-volume\") pod \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.889676 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-config-volume\") pod \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.889888 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.889952 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2p2v\" (UniqueName: \"kubernetes.io/projected/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-kube-api-access-z2p2v\") pod \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\" (UID: \"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb\") " Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.891159 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.391118767 +0000 UTC m=+217.343484232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.891558 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-config-volume" (OuterVolumeSpecName: "config-volume") pod "5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb" (UID: "5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.900387 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-kube-api-access-z2p2v" (OuterVolumeSpecName: "kube-api-access-z2p2v") pod "5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb" (UID: "5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb"). InnerVolumeSpecName "kube-api-access-z2p2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.900648 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb" (UID: "5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.916598 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.916830 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.933466 4941 patch_prober.go:28] interesting pod/apiserver-76f77b778f-j2bnz container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]log ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]etcd ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/generic-apiserver-start-informers ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/max-in-flight-filter ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 07 06:55:19 crc kubenswrapper[4941]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 07 06:55:19 crc kubenswrapper[4941]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/project.openshift.io-projectcache ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/openshift.io-startinformers ok Mar 07 06:55:19 crc kubenswrapper[4941]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 07 06:55:19 crc kubenswrapper[4941]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 07 06:55:19 crc kubenswrapper[4941]: livez check failed Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.933560 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" podUID="a5d1706b-179e-4ffd-a2af-e62d05e1e36d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.946534 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:19 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:19 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:19 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.946611 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.972556 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8c0212-2a6d-4636-a75b-08a350f5948f" path="/var/lib/kubelet/pods/bb8c0212-2a6d-4636-a75b-08a350f5948f/volumes" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.973666 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdca2db2-c710-4685-9033-fbbe40f73076" path="/var/lib/kubelet/pods/fdca2db2-c710-4685-9033-fbbe40f73076/volumes" Mar 07 06:55:19 crc kubenswrapper[4941]: W0307 06:55:19.974441 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36212ca9_755e_4104_a203_7c136afbfca9.slice/crio-ff630edcad373f9613081a2a57da9c3f0cbe2b615c43a5ae01227878d85eefca WatchSource:0}: Error finding container ff630edcad373f9613081a2a57da9c3f0cbe2b615c43a5ae01227878d85eefca: Status 404 returned error can't find the container with id ff630edcad373f9613081a2a57da9c3f0cbe2b615c43a5ae01227878d85eefca Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.974758 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g4dmh"] Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.992208 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.992494 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2p2v\" (UniqueName: \"kubernetes.io/projected/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-kube-api-access-z2p2v\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.992513 4941 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:19 crc kubenswrapper[4941]: I0307 06:55:19.992524 4941 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:19 crc kubenswrapper[4941]: E0307 06:55:19.993878 4941 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.493864829 +0000 UTC m=+217.446230374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.048700 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.048745 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.050834 4941 patch_prober.go:28] interesting pod/console-f9d7485db-nwzjs container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.050884 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nwzjs" podUID="46da50cb-1038-4289-be6d-e5f3b4c70ab3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.103477 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:20 crc kubenswrapper[4941]: E0307 06:55:20.103785 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.603736819 +0000 UTC m=+217.556102284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.103965 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:20 crc kubenswrapper[4941]: E0307 06:55:20.106213 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.606197887 +0000 UTC m=+217.558563352 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wpb7c" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.205579 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:20 crc kubenswrapper[4941]: E0307 06:55:20.205964 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 06:55:20.705941395 +0000 UTC m=+217.658306860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.206052 4941 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-07T06:55:19.749341958Z","Handler":null,"Name":""} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.218652 4941 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.218701 4941 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.299374 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cp52v" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.307534 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.366233 4941 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.366321 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.411001 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wpb7c\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.488368 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ea6b89d8f5ea6a0a159dcf3c5a7d4a21bd097fa36fc1d9749427db985c31be9e"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.488472 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8366db1c0b141e7ad5c6c837e0a5b9df30cc6c918b822b481fffb548d9a2cba3"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.489347 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.490581 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.493588 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" event={"ID":"80030e60-caa3-4aad-8b00-10f5143d9243","Type":"ContainerStarted","Data":"04e6dad53f410408ffb87ed94f8c5ef402c3148333dd15f1e36f29daec53550b"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.499524 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46286: no serving certificate available for the kubelet" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.503270 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" event={"ID":"e166d976-6595-48f1-be9a-7c5b64567366","Type":"ContainerStarted","Data":"68cac33d325e29b9d278f68b552915eccc37d52f282843f15c99e2ef0135a333"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.503307 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" event={"ID":"e166d976-6595-48f1-be9a-7c5b64567366","Type":"ContainerStarted","Data":"71891a3fb8c033074983e3a0147fa29d34032404617d023a61be4e4e0afff3ec"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.503323 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.514549 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.518626 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.521730 4941 generic.go:334] "Generic (PLEG): container finished" podID="d003569d-8946-47e7-adf2-5148ca8de944" containerID="d2d3c310234855d9e00e2325c2f877b0fb93bea45fc23a8a441a51c53dddc621" exitCode=0 Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.521887 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktp7f" event={"ID":"d003569d-8946-47e7-adf2-5148ca8de944","Type":"ContainerDied","Data":"d2d3c310234855d9e00e2325c2f877b0fb93bea45fc23a8a441a51c53dddc621"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.521926 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktp7f" event={"ID":"d003569d-8946-47e7-adf2-5148ca8de944","Type":"ContainerStarted","Data":"317b1eb31db1baeb331bfbf3988ca6ad93caa39b7bd3580dd19ca28e7a532202"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.529023 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" podStartSLOduration=4.529003893 podStartE2EDuration="4.529003893s" podCreationTimestamp="2026-03-07 06:55:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:20.528219751 +0000 UTC m=+217.480585216" watchObservedRunningTime="2026-03-07 06:55:20.529003893 +0000 UTC m=+217.481369348" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.531052 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vsntc" 
event={"ID":"b6ad282e-3374-4a34-8956-252c6196274d","Type":"ContainerStarted","Data":"f6e3f5fff575f9bd6c6129c01a10de0767ff0b31d4c98c0ccdfdfb3e35af3ce3"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.535689 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" event={"ID":"5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb","Type":"ContainerDied","Data":"79262c3b7c55439b088428bb281acb42da3a85dc921dfac56b12e0e350fe0b6a"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.535727 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79262c3b7c55439b088428bb281acb42da3a85dc921dfac56b12e0e350fe0b6a" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.535799 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.543782 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.547187 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g4dmh" event={"ID":"36212ca9-755e-4104-a203-7c136afbfca9","Type":"ContainerStarted","Data":"f212e51eddef03cf072afe8f79363a25de336b0313cab1ee4c707559d7144f03"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.547232 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g4dmh" event={"ID":"36212ca9-755e-4104-a203-7c136afbfca9","Type":"ContainerStarted","Data":"ff630edcad373f9613081a2a57da9c3f0cbe2b615c43a5ae01227878d85eefca"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.554862 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"42f9abdf1803f41704c6829402802b165f9d42f7799341768727cb5dd2be6124"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.554930 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3ca5de8efa03b3268fdf1fa915ae27d9a0e6f074b2b2f1e6b7fdcab8f20a3a1c"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.589245 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"70ddba3f-71ed-49d2-a174-2d59cc716962","Type":"ContainerStarted","Data":"5fd3ce026edc781927867ffafe8b35a3d2da051b89d22ca930d2badf67c41145"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.590891 4941 generic.go:334] "Generic (PLEG): container finished" podID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerID="1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561" exitCode=0 Mar 07 06:55:20 crc 
kubenswrapper[4941]: I0307 06:55:20.590941 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxk7" event={"ID":"37622fc0-c5dc-4e0d-848a-214bce293f7f","Type":"ContainerDied","Data":"1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.595690 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a74544aed966d824e62460f652fbd002fe0cdc7613e5d1717b78412985e7610b"} Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.871865 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.872386 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.873051 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.873090 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 
10.217.0.19:8080: connect: connection refused" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.932079 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.932117 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.942079 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.946493 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt"] Mar 07 06:55:20 crc kubenswrapper[4941]: E0307 06:55:20.946876 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb" containerName="collect-profiles" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.946890 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb" containerName="collect-profiles" Mar 07 06:55:20 crc kubenswrapper[4941]: E0307 06:55:20.946906 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdca2db2-c710-4685-9033-fbbe40f73076" containerName="route-controller-manager" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.946912 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdca2db2-c710-4685-9033-fbbe40f73076" containerName="route-controller-manager" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.947161 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb" containerName="collect-profiles" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.947178 4941 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fdca2db2-c710-4685-9033-fbbe40f73076" containerName="route-controller-manager" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.948020 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.949430 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.951773 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:20 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:20 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:20 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.951835 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.958393 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.958701 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.958808 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.959359 4941 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.959525 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.972720 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 06:55:20 crc kubenswrapper[4941]: I0307 06:55:20.974278 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt"] Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.043426 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-client-ca\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.043502 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-serving-cert\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.043627 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-config\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " 
pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.043769 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8wgm\" (UniqueName: \"kubernetes.io/projected/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-kube-api-access-m8wgm\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.145987 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8wgm\" (UniqueName: \"kubernetes.io/projected/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-kube-api-access-m8wgm\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.146451 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-client-ca\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.146473 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-serving-cert\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.146511 4941 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-config\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.147723 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-config\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.147834 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-client-ca\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.170016 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8wgm\" (UniqueName: \"kubernetes.io/projected/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-kube-api-access-m8wgm\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.184525 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-serving-cert\") pod \"route-controller-manager-57c58dd795-zr2nt\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " 
pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.262057 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wpb7c"] Mar 07 06:55:21 crc kubenswrapper[4941]: W0307 06:55:21.277886 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf41cff_9af9_423f_8e57_117983f90b7b.slice/crio-e76fda133319d6f4fd1adb769f781001de73e7fc3eb6e6a28a9c64bf8c785c65 WatchSource:0}: Error finding container e76fda133319d6f4fd1adb769f781001de73e7fc3eb6e6a28a9c64bf8c785c65: Status 404 returned error can't find the container with id e76fda133319d6f4fd1adb769f781001de73e7fc3eb6e6a28a9c64bf8c785c65 Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.284932 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.606688 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vsntc" event={"ID":"b6ad282e-3374-4a34-8956-252c6196274d","Type":"ContainerStarted","Data":"10b6dc062f4b2c31164ef4b78f0c0f1916e1e15b53e5abf8b398e6a62f8d5802"} Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.607197 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vsntc" event={"ID":"b6ad282e-3374-4a34-8956-252c6196274d","Type":"ContainerStarted","Data":"79d8b0316ff8f27e9f2c756ce10f959e3930bf749c2dafc0eef00c521f76bd1a"} Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.627330 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q9fpr" event={"ID":"80030e60-caa3-4aad-8b00-10f5143d9243","Type":"ContainerStarted","Data":"dc21bca7277082a4b30980ff066b6eb0f984ff44176169bc2bf564cdcc84fd3b"} Mar 07 
06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.636012 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vsntc" podStartSLOduration=14.635993704 podStartE2EDuration="14.635993704s" podCreationTimestamp="2026-03-07 06:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:21.63187196 +0000 UTC m=+218.584237425" watchObservedRunningTime="2026-03-07 06:55:21.635993704 +0000 UTC m=+218.588359169" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.643380 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" event={"ID":"0cf41cff-9af9-423f-8e57-117983f90b7b","Type":"ContainerStarted","Data":"e76fda133319d6f4fd1adb769f781001de73e7fc3eb6e6a28a9c64bf8c785c65"} Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.661249 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q9fpr" podStartSLOduration=184.661227157 podStartE2EDuration="3m4.661227157s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:21.65198279 +0000 UTC m=+218.604348255" watchObservedRunningTime="2026-03-07 06:55:21.661227157 +0000 UTC m=+218.613592622" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.676871 4941 generic.go:334] "Generic (PLEG): container finished" podID="36212ca9-755e-4104-a203-7c136afbfca9" containerID="f212e51eddef03cf072afe8f79363a25de336b0313cab1ee4c707559d7144f03" exitCode=0 Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.676978 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g4dmh" 
event={"ID":"36212ca9-755e-4104-a203-7c136afbfca9","Type":"ContainerDied","Data":"f212e51eddef03cf072afe8f79363a25de336b0313cab1ee4c707559d7144f03"} Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.700035 4941 generic.go:334] "Generic (PLEG): container finished" podID="70ddba3f-71ed-49d2-a174-2d59cc716962" containerID="0516b340b1058685fc068669d5ac29b9ec53ce02dfa5a39d790112dff704c18e" exitCode=0 Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.700750 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"70ddba3f-71ed-49d2-a174-2d59cc716962","Type":"ContainerDied","Data":"0516b340b1058685fc068669d5ac29b9ec53ce02dfa5a39d790112dff704c18e"} Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.700836 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt"] Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.706166 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8mfp9" Mar 07 06:55:21 crc kubenswrapper[4941]: W0307 06:55:21.729454 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d3a15ee_eb4d_4c17_8e55_4c6323437de5.slice/crio-64593f98da08597dcdf7e5bf04ad064cc4483af0ca1a0d76d2cc2528054ce7a9 WatchSource:0}: Error finding container 64593f98da08597dcdf7e5bf04ad064cc4483af0ca1a0d76d2cc2528054ce7a9: Status 404 returned error can't find the container with id 64593f98da08597dcdf7e5bf04ad064cc4483af0ca1a0d76d2cc2528054ce7a9 Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.946454 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:21 
crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:21 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:21 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.946952 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:21 crc kubenswrapper[4941]: I0307 06:55:21.990022 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.718109 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" event={"ID":"6d3a15ee-eb4d-4c17-8e55-4c6323437de5","Type":"ContainerStarted","Data":"ee43eb1a4c3bc7552b813e059be0f313e9ef224d9b01ad892ad7e2d164e22781"} Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.718537 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" event={"ID":"6d3a15ee-eb4d-4c17-8e55-4c6323437de5","Type":"ContainerStarted","Data":"64593f98da08597dcdf7e5bf04ad064cc4483af0ca1a0d76d2cc2528054ce7a9"} Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.718971 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.724742 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.727343 4941 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" event={"ID":"0cf41cff-9af9-423f-8e57-117983f90b7b","Type":"ContainerStarted","Data":"9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15"} Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.727375 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.740993 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" podStartSLOduration=6.740967688 podStartE2EDuration="6.740967688s" podCreationTimestamp="2026-03-07 06:55:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:22.739995761 +0000 UTC m=+219.692361246" watchObservedRunningTime="2026-03-07 06:55:22.740967688 +0000 UTC m=+219.693333153" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.766150 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" podStartSLOduration=185.765329767 podStartE2EDuration="3m5.765329767s" podCreationTimestamp="2026-03-07 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:55:22.763872396 +0000 UTC m=+219.716237871" watchObservedRunningTime="2026-03-07 06:55:22.765329767 +0000 UTC m=+219.717695232" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.845554 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.848162 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.852115 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.852590 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.857252 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.951175 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:22 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:22 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:22 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.951772 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.985339 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54fec849-0c4e-4d9b-bc6a-f11742474018-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"54fec849-0c4e-4d9b-bc6a-f11742474018\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:22 crc kubenswrapper[4941]: I0307 06:55:22.985601 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54fec849-0c4e-4d9b-bc6a-f11742474018-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"54fec849-0c4e-4d9b-bc6a-f11742474018\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.069332 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.089439 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54fec849-0c4e-4d9b-bc6a-f11742474018-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"54fec849-0c4e-4d9b-bc6a-f11742474018\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.089554 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54fec849-0c4e-4d9b-bc6a-f11742474018-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"54fec849-0c4e-4d9b-bc6a-f11742474018\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.089638 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54fec849-0c4e-4d9b-bc6a-f11742474018-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"54fec849-0c4e-4d9b-bc6a-f11742474018\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.123462 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54fec849-0c4e-4d9b-bc6a-f11742474018-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"54fec849-0c4e-4d9b-bc6a-f11742474018\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.190857 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ddba3f-71ed-49d2-a174-2d59cc716962-kube-api-access\") pod \"70ddba3f-71ed-49d2-a174-2d59cc716962\" (UID: \"70ddba3f-71ed-49d2-a174-2d59cc716962\") " Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.191697 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ddba3f-71ed-49d2-a174-2d59cc716962-kubelet-dir\") pod \"70ddba3f-71ed-49d2-a174-2d59cc716962\" (UID: \"70ddba3f-71ed-49d2-a174-2d59cc716962\") " Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.191868 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70ddba3f-71ed-49d2-a174-2d59cc716962-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "70ddba3f-71ed-49d2-a174-2d59cc716962" (UID: "70ddba3f-71ed-49d2-a174-2d59cc716962"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.192057 4941 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ddba3f-71ed-49d2-a174-2d59cc716962-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.192801 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.208875 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ddba3f-71ed-49d2-a174-2d59cc716962-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "70ddba3f-71ed-49d2-a174-2d59cc716962" (UID: "70ddba3f-71ed-49d2-a174-2d59cc716962"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.293105 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ddba3f-71ed-49d2-a174-2d59cc716962-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.756146 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.756545 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"70ddba3f-71ed-49d2-a174-2d59cc716962","Type":"ContainerDied","Data":"5fd3ce026edc781927867ffafe8b35a3d2da051b89d22ca930d2badf67c41145"} Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.756603 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fd3ce026edc781927867ffafe8b35a3d2da051b89d22ca930d2badf67c41145" Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.805585 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.944660 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:23 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:23 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:23 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:23 crc kubenswrapper[4941]: I0307 06:55:23.944729 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" 
podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:24 crc kubenswrapper[4941]: I0307 06:55:24.366891 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46546: no serving certificate available for the kubelet" Mar 07 06:55:24 crc kubenswrapper[4941]: I0307 06:55:24.788191 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"54fec849-0c4e-4d9b-bc6a-f11742474018","Type":"ContainerStarted","Data":"5c248d8bba0f29a79b6f9b838ea1dcc0cec3fa6717d6f80a0c6af5cf306a66fd"} Mar 07 06:55:24 crc kubenswrapper[4941]: I0307 06:55:24.923107 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:24 crc kubenswrapper[4941]: I0307 06:55:24.928089 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-j2bnz" Mar 07 06:55:24 crc kubenswrapper[4941]: I0307 06:55:24.943494 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:24 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:24 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:24 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:24 crc kubenswrapper[4941]: I0307 06:55:24.943570 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:25 crc kubenswrapper[4941]: I0307 06:55:25.657140 4941 ???:1] "http: TLS handshake error from 192.168.126.11:46556: no serving 
certificate available for the kubelet" Mar 07 06:55:25 crc kubenswrapper[4941]: I0307 06:55:25.807036 4941 generic.go:334] "Generic (PLEG): container finished" podID="54fec849-0c4e-4d9b-bc6a-f11742474018" containerID="65f67a77e187cbc2ad8ddfc886fe75c80e51c2816d139a63d8d7ba3e0d221e4c" exitCode=0 Mar 07 06:55:25 crc kubenswrapper[4941]: I0307 06:55:25.807149 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"54fec849-0c4e-4d9b-bc6a-f11742474018","Type":"ContainerDied","Data":"65f67a77e187cbc2ad8ddfc886fe75c80e51c2816d139a63d8d7ba3e0d221e4c"} Mar 07 06:55:25 crc kubenswrapper[4941]: I0307 06:55:25.943345 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:25 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:25 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:25 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:25 crc kubenswrapper[4941]: I0307 06:55:25.943527 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:26 crc kubenswrapper[4941]: I0307 06:55:26.084973 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m9vbd" Mar 07 06:55:26 crc kubenswrapper[4941]: I0307 06:55:26.943252 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:26 crc kubenswrapper[4941]: [-]has-synced failed: 
reason withheld Mar 07 06:55:26 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:26 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:26 crc kubenswrapper[4941]: I0307 06:55:26.943712 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:27 crc kubenswrapper[4941]: I0307 06:55:27.943612 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:27 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:27 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:27 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:27 crc kubenswrapper[4941]: I0307 06:55:27.943691 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:28 crc kubenswrapper[4941]: I0307 06:55:28.943085 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:28 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:28 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:28 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:28 crc kubenswrapper[4941]: I0307 06:55:28.943157 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" 
podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:29 crc kubenswrapper[4941]: I0307 06:55:29.943723 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:29 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:29 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:29 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:29 crc kubenswrapper[4941]: I0307 06:55:29.944846 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:30 crc kubenswrapper[4941]: I0307 06:55:30.050056 4941 patch_prober.go:28] interesting pod/console-f9d7485db-nwzjs container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 07 06:55:30 crc kubenswrapper[4941]: I0307 06:55:30.050154 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nwzjs" podUID="46da50cb-1038-4289-be6d-e5f3b4c70ab3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 07 06:55:30 crc kubenswrapper[4941]: I0307 06:55:30.871168 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 
06:55:30 crc kubenswrapper[4941]: I0307 06:55:30.871235 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:30 crc kubenswrapper[4941]: I0307 06:55:30.871304 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:30 crc kubenswrapper[4941]: I0307 06:55:30.872529 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:30 crc kubenswrapper[4941]: I0307 06:55:30.942697 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:30 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:30 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:30 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:30 crc kubenswrapper[4941]: I0307 06:55:30.942830 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:31 crc kubenswrapper[4941]: I0307 06:55:31.944297 4941 patch_prober.go:28] interesting 
pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:31 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:31 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:31 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:31 crc kubenswrapper[4941]: I0307 06:55:31.944936 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:32 crc kubenswrapper[4941]: I0307 06:55:32.942832 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:32 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:32 crc kubenswrapper[4941]: [+]process-running ok Mar 07 06:55:32 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:32 crc kubenswrapper[4941]: I0307 06:55:32.942895 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:33 crc kubenswrapper[4941]: I0307 06:55:33.942454 4941 patch_prober.go:28] interesting pod/router-default-5444994796-w745p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 06:55:33 crc kubenswrapper[4941]: [-]has-synced failed: reason withheld Mar 07 06:55:33 crc kubenswrapper[4941]: 
[+]process-running ok Mar 07 06:55:33 crc kubenswrapper[4941]: healthz check failed Mar 07 06:55:33 crc kubenswrapper[4941]: I0307 06:55:33.942523 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w745p" podUID="bffb6535-e060-4328-8e0d-0b8bd64c656b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:55:34 crc kubenswrapper[4941]: I0307 06:55:34.644427 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66dd75d944-rw77d"] Mar 07 06:55:34 crc kubenswrapper[4941]: I0307 06:55:34.645189 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" podUID="e166d976-6595-48f1-be9a-7c5b64567366" containerName="controller-manager" containerID="cri-o://68cac33d325e29b9d278f68b552915eccc37d52f282843f15c99e2ef0135a333" gracePeriod=30 Mar 07 06:55:34 crc kubenswrapper[4941]: I0307 06:55:34.655692 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt"] Mar 07 06:55:34 crc kubenswrapper[4941]: I0307 06:55:34.655949 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" podUID="6d3a15ee-eb4d-4c17-8e55-4c6323437de5" containerName="route-controller-manager" containerID="cri-o://ee43eb1a4c3bc7552b813e059be0f313e9ef224d9b01ad892ad7e2d164e22781" gracePeriod=30 Mar 07 06:55:34 crc kubenswrapper[4941]: I0307 06:55:34.950475 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:34 crc kubenswrapper[4941]: I0307 06:55:34.956896 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-w745p" Mar 07 06:55:35 crc 
kubenswrapper[4941]: I0307 06:55:35.902931 4941 generic.go:334] "Generic (PLEG): container finished" podID="e166d976-6595-48f1-be9a-7c5b64567366" containerID="68cac33d325e29b9d278f68b552915eccc37d52f282843f15c99e2ef0135a333" exitCode=0 Mar 07 06:55:35 crc kubenswrapper[4941]: I0307 06:55:35.903625 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" event={"ID":"e166d976-6595-48f1-be9a-7c5b64567366","Type":"ContainerDied","Data":"68cac33d325e29b9d278f68b552915eccc37d52f282843f15c99e2ef0135a333"} Mar 07 06:55:35 crc kubenswrapper[4941]: I0307 06:55:35.906562 4941 generic.go:334] "Generic (PLEG): container finished" podID="6d3a15ee-eb4d-4c17-8e55-4c6323437de5" containerID="ee43eb1a4c3bc7552b813e059be0f313e9ef224d9b01ad892ad7e2d164e22781" exitCode=0 Mar 07 06:55:35 crc kubenswrapper[4941]: I0307 06:55:35.906832 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" event={"ID":"6d3a15ee-eb4d-4c17-8e55-4c6323437de5","Type":"ContainerDied","Data":"ee43eb1a4c3bc7552b813e059be0f313e9ef224d9b01ad892ad7e2d164e22781"} Mar 07 06:55:35 crc kubenswrapper[4941]: I0307 06:55:35.979136 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.168171 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54fec849-0c4e-4d9b-bc6a-f11742474018-kube-api-access\") pod \"54fec849-0c4e-4d9b-bc6a-f11742474018\" (UID: \"54fec849-0c4e-4d9b-bc6a-f11742474018\") " Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.168279 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54fec849-0c4e-4d9b-bc6a-f11742474018-kubelet-dir\") pod \"54fec849-0c4e-4d9b-bc6a-f11742474018\" (UID: \"54fec849-0c4e-4d9b-bc6a-f11742474018\") " Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.168522 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54fec849-0c4e-4d9b-bc6a-f11742474018-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "54fec849-0c4e-4d9b-bc6a-f11742474018" (UID: "54fec849-0c4e-4d9b-bc6a-f11742474018"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.177828 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fec849-0c4e-4d9b-bc6a-f11742474018-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "54fec849-0c4e-4d9b-bc6a-f11742474018" (UID: "54fec849-0c4e-4d9b-bc6a-f11742474018"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.270020 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54fec849-0c4e-4d9b-bc6a-f11742474018-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.270068 4941 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54fec849-0c4e-4d9b-bc6a-f11742474018-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.916713 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"54fec849-0c4e-4d9b-bc6a-f11742474018","Type":"ContainerDied","Data":"5c248d8bba0f29a79b6f9b838ea1dcc0cec3fa6717d6f80a0c6af5cf306a66fd"} Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.916833 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c248d8bba0f29a79b6f9b838ea1dcc0cec3fa6717d6f80a0c6af5cf306a66fd" Mar 07 06:55:36 crc kubenswrapper[4941]: I0307 06:55:36.916839 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 06:55:38 crc kubenswrapper[4941]: I0307 06:55:38.724051 4941 patch_prober.go:28] interesting pod/controller-manager-66dd75d944-rw77d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 07 06:55:38 crc kubenswrapper[4941]: I0307 06:55:38.724125 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" podUID="e166d976-6595-48f1-be9a-7c5b64567366" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.233482 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.241030 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.354325 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.354380 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 06:55:40 crc kubenswrapper[4941]: 
I0307 06:55:40.496988 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.869444 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.869833 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.870005 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.870098 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.870206 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.870879 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" 
containerStatusID={"Type":"cri-o","ID":"873ec81fddc92445e44308c67cb3c21ff0087fe62ec875849dbe1245a3902040"} pod="openshift-console/downloads-7954f5f757-5tr44" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.870996 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" containerID="cri-o://873ec81fddc92445e44308c67cb3c21ff0087fe62ec875849dbe1245a3902040" gracePeriod=2 Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.871037 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:40 crc kubenswrapper[4941]: I0307 06:55:40.871231 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:41 crc kubenswrapper[4941]: I0307 06:55:41.287439 4941 patch_prober.go:28] interesting pod/route-controller-manager-57c58dd795-zr2nt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 07 06:55:41 crc kubenswrapper[4941]: I0307 06:55:41.289699 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" podUID="6d3a15ee-eb4d-4c17-8e55-4c6323437de5" containerName="route-controller-manager" 
probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Mar 07 06:55:46 crc kubenswrapper[4941]: I0307 06:55:46.172391 4941 ???:1] "http: TLS handshake error from 192.168.126.11:50008: no serving certificate available for the kubelet" Mar 07 06:55:48 crc kubenswrapper[4941]: I0307 06:55:48.722495 4941 patch_prober.go:28] interesting pod/controller-manager-66dd75d944-rw77d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 07 06:55:48 crc kubenswrapper[4941]: I0307 06:55:48.722560 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" podUID="e166d976-6595-48f1-be9a-7c5b64567366" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 07 06:55:50 crc kubenswrapper[4941]: I0307 06:55:50.869818 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:55:50 crc kubenswrapper[4941]: I0307 06:55:50.870307 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.006150 4941 generic.go:334] "Generic (PLEG): container finished" podID="a77c6084-94de-4ebc-9a75-a83efa28b094" 
containerID="873ec81fddc92445e44308c67cb3c21ff0087fe62ec875849dbe1245a3902040" exitCode=0 Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.006219 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5tr44" event={"ID":"a77c6084-94de-4ebc-9a75-a83efa28b094","Type":"ContainerDied","Data":"873ec81fddc92445e44308c67cb3c21ff0087fe62ec875849dbe1245a3902040"} Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.281991 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p666q" Mar 07 06:55:51 crc kubenswrapper[4941]: E0307 06:55:51.333390 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 07 06:55:51 crc kubenswrapper[4941]: E0307 06:55:51.334175 4941 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 06:55:51 crc kubenswrapper[4941]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 07 06:55:51 crc kubenswrapper[4941]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sxhjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29547774-2b2fh_openshift-infra(7566a1ff-1f6b-4e99-903b-ff036f98c411): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 07 06:55:51 crc kubenswrapper[4941]: > logger="UnhandledError" Mar 07 06:55:51 crc kubenswrapper[4941]: E0307 06:55:51.335461 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" podUID="7566a1ff-1f6b-4e99-903b-ff036f98c411" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.713790 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.715334 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.734765 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-client-ca\") pod \"e166d976-6595-48f1-be9a-7c5b64567366\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.734853 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-config\") pod \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.734891 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8wgm\" (UniqueName: \"kubernetes.io/projected/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-kube-api-access-m8wgm\") pod \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.734937 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-proxy-ca-bundles\") pod \"e166d976-6595-48f1-be9a-7c5b64567366\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.734990 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e166d976-6595-48f1-be9a-7c5b64567366-serving-cert\") pod \"e166d976-6595-48f1-be9a-7c5b64567366\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.735025 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-client-ca\") pod \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.735096 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq6hx\" (UniqueName: \"kubernetes.io/projected/e166d976-6595-48f1-be9a-7c5b64567366-kube-api-access-vq6hx\") pod \"e166d976-6595-48f1-be9a-7c5b64567366\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.735125 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-serving-cert\") pod \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\" (UID: \"6d3a15ee-eb4d-4c17-8e55-4c6323437de5\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.735165 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-config\") pod \"e166d976-6595-48f1-be9a-7c5b64567366\" (UID: \"e166d976-6595-48f1-be9a-7c5b64567366\") " Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.737099 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-config" (OuterVolumeSpecName: "config") pod "e166d976-6595-48f1-be9a-7c5b64567366" (UID: "e166d976-6595-48f1-be9a-7c5b64567366"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.737128 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e166d976-6595-48f1-be9a-7c5b64567366" (UID: "e166d976-6595-48f1-be9a-7c5b64567366"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.737570 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-client-ca" (OuterVolumeSpecName: "client-ca") pod "e166d976-6595-48f1-be9a-7c5b64567366" (UID: "e166d976-6595-48f1-be9a-7c5b64567366"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.737594 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-config" (OuterVolumeSpecName: "config") pod "6d3a15ee-eb4d-4c17-8e55-4c6323437de5" (UID: "6d3a15ee-eb4d-4c17-8e55-4c6323437de5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.738032 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-client-ca" (OuterVolumeSpecName: "client-ca") pod "6d3a15ee-eb4d-4c17-8e55-4c6323437de5" (UID: "6d3a15ee-eb4d-4c17-8e55-4c6323437de5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.750466 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66"] Mar 07 06:55:51 crc kubenswrapper[4941]: E0307 06:55:51.750757 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fec849-0c4e-4d9b-bc6a-f11742474018" containerName="pruner" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.750773 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fec849-0c4e-4d9b-bc6a-f11742474018" containerName="pruner" Mar 07 06:55:51 crc kubenswrapper[4941]: E0307 06:55:51.750785 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3a15ee-eb4d-4c17-8e55-4c6323437de5" containerName="route-controller-manager" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.750827 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3a15ee-eb4d-4c17-8e55-4c6323437de5" containerName="route-controller-manager" Mar 07 06:55:51 crc kubenswrapper[4941]: E0307 06:55:51.750842 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ddba3f-71ed-49d2-a174-2d59cc716962" containerName="pruner" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.750850 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ddba3f-71ed-49d2-a174-2d59cc716962" containerName="pruner" Mar 07 06:55:51 crc kubenswrapper[4941]: E0307 06:55:51.750861 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e166d976-6595-48f1-be9a-7c5b64567366" containerName="controller-manager" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.750868 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e166d976-6595-48f1-be9a-7c5b64567366" containerName="controller-manager" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.750989 4941 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6d3a15ee-eb4d-4c17-8e55-4c6323437de5" containerName="route-controller-manager" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.751003 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="54fec849-0c4e-4d9b-bc6a-f11742474018" containerName="pruner" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.751018 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e166d976-6595-48f1-be9a-7c5b64567366" containerName="controller-manager" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.751031 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ddba3f-71ed-49d2-a174-2d59cc716962" containerName="pruner" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.751816 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.756885 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e166d976-6595-48f1-be9a-7c5b64567366-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e166d976-6595-48f1-be9a-7c5b64567366" (UID: "e166d976-6595-48f1-be9a-7c5b64567366"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.759719 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66"] Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.760129 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e166d976-6595-48f1-be9a-7c5b64567366-kube-api-access-vq6hx" (OuterVolumeSpecName: "kube-api-access-vq6hx") pod "e166d976-6595-48f1-be9a-7c5b64567366" (UID: "e166d976-6595-48f1-be9a-7c5b64567366"). InnerVolumeSpecName "kube-api-access-vq6hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.760521 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6d3a15ee-eb4d-4c17-8e55-4c6323437de5" (UID: "6d3a15ee-eb4d-4c17-8e55-4c6323437de5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.763572 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-kube-api-access-m8wgm" (OuterVolumeSpecName: "kube-api-access-m8wgm") pod "6d3a15ee-eb4d-4c17-8e55-4c6323437de5" (UID: "6d3a15ee-eb4d-4c17-8e55-4c6323437de5"). InnerVolumeSpecName "kube-api-access-m8wgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837247 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-config\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837335 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/481e3303-aed7-4e24-b1ce-62f444e7fc1f-serving-cert\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837538 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-t66bn\" (UniqueName: \"kubernetes.io/projected/481e3303-aed7-4e24-b1ce-62f444e7fc1f-kube-api-access-t66bn\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837752 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-client-ca\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837904 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837937 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8wgm\" (UniqueName: \"kubernetes.io/projected/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-kube-api-access-m8wgm\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837949 4941 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837958 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e166d976-6595-48f1-be9a-7c5b64567366-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837967 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837977 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq6hx\" (UniqueName: \"kubernetes.io/projected/e166d976-6595-48f1-be9a-7c5b64567366-kube-api-access-vq6hx\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837985 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d3a15ee-eb4d-4c17-8e55-4c6323437de5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.837993 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.838016 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e166d976-6595-48f1-be9a-7c5b64567366-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.939680 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-config\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.939793 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/481e3303-aed7-4e24-b1ce-62f444e7fc1f-serving-cert\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " 
pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.939830 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66bn\" (UniqueName: \"kubernetes.io/projected/481e3303-aed7-4e24-b1ce-62f444e7fc1f-kube-api-access-t66bn\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.939914 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-client-ca\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.941299 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-client-ca\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.941531 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-config\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.943968 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/481e3303-aed7-4e24-b1ce-62f444e7fc1f-serving-cert\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:51 crc kubenswrapper[4941]: I0307 06:55:51.957738 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66bn\" (UniqueName: \"kubernetes.io/projected/481e3303-aed7-4e24-b1ce-62f444e7fc1f-kube-api-access-t66bn\") pod \"route-controller-manager-7487c9d76-2sm66\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.013954 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" event={"ID":"e166d976-6595-48f1-be9a-7c5b64567366","Type":"ContainerDied","Data":"71891a3fb8c033074983e3a0147fa29d34032404617d023a61be4e4e0afff3ec"} Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.014013 4941 scope.go:117] "RemoveContainer" containerID="68cac33d325e29b9d278f68b552915eccc37d52f282843f15c99e2ef0135a333" Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.014124 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66dd75d944-rw77d" Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.021915 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.021893 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" event={"ID":"6d3a15ee-eb4d-4c17-8e55-4c6323437de5","Type":"ContainerDied","Data":"64593f98da08597dcdf7e5bf04ad064cc4483af0ca1a0d76d2cc2528054ce7a9"} Mar 07 06:55:52 crc kubenswrapper[4941]: E0307 06:55:52.023614 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" podUID="7566a1ff-1f6b-4e99-903b-ff036f98c411" Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.034152 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66dd75d944-rw77d"] Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.036810 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66dd75d944-rw77d"] Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.066609 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt"] Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.068676 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt"] Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.098435 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.286600 4941 patch_prober.go:28] interesting pod/route-controller-manager-57c58dd795-zr2nt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 06:55:52 crc kubenswrapper[4941]: I0307 06:55:52.286697 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-57c58dd795-zr2nt" podUID="6d3a15ee-eb4d-4c17-8e55-4c6323437de5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 06:55:53 crc kubenswrapper[4941]: I0307 06:55:53.964781 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d3a15ee-eb4d-4c17-8e55-4c6323437de5" path="/var/lib/kubelet/pods/6d3a15ee-eb4d-4c17-8e55-4c6323437de5/volumes" Mar 07 06:55:53 crc kubenswrapper[4941]: I0307 06:55:53.967681 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e166d976-6595-48f1-be9a-7c5b64567366" path="/var/lib/kubelet/pods/e166d976-6595-48f1-be9a-7c5b64567366/volumes" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.633964 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59694df6ff-qr8hr"] Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.635549 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.638114 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.638241 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.640693 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.640720 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.640714 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.640845 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.651241 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.653867 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59694df6ff-qr8hr"] Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.676863 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-config\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " 
pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.676934 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-client-ca\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.676997 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-proxy-ca-bundles\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.677024 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv88f\" (UniqueName: \"kubernetes.io/projected/2ccd6084-ccb5-405e-ab81-d5447115d772-kube-api-access-hv88f\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.677039 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ccd6084-ccb5-405e-ab81-d5447115d772-serving-cert\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.702837 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66"] Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.778218 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-client-ca\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.778313 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-proxy-ca-bundles\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.778350 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv88f\" (UniqueName: \"kubernetes.io/projected/2ccd6084-ccb5-405e-ab81-d5447115d772-kube-api-access-hv88f\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.778380 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ccd6084-ccb5-405e-ab81-d5447115d772-serving-cert\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.778425 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-config\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.779671 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-proxy-ca-bundles\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.779798 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-config\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.779805 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-client-ca\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.787287 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ccd6084-ccb5-405e-ab81-d5447115d772-serving-cert\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.796666 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hv88f\" (UniqueName: \"kubernetes.io/projected/2ccd6084-ccb5-405e-ab81-d5447115d772-kube-api-access-hv88f\") pod \"controller-manager-59694df6ff-qr8hr\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:54 crc kubenswrapper[4941]: I0307 06:55:54.971497 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.238072 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.238858 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.243162 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.243205 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.248563 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.284078 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a2ec596-b148-426d-a94f-b64539627393-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a2ec596-b148-426d-a94f-b64539627393\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.284130 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/4a2ec596-b148-426d-a94f-b64539627393-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a2ec596-b148-426d-a94f-b64539627393\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.385758 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a2ec596-b148-426d-a94f-b64539627393-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a2ec596-b148-426d-a94f-b64539627393\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.386214 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a2ec596-b148-426d-a94f-b64539627393-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a2ec596-b148-426d-a94f-b64539627393\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.385927 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a2ec596-b148-426d-a94f-b64539627393-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a2ec596-b148-426d-a94f-b64539627393\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.404375 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a2ec596-b148-426d-a94f-b64539627393-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a2ec596-b148-426d-a94f-b64539627393\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:55:55 crc kubenswrapper[4941]: I0307 06:55:55.571049 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:55:56 crc kubenswrapper[4941]: E0307 06:55:56.435357 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 07 06:55:56 crc kubenswrapper[4941]: E0307 06:55:56.436228 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qhtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xsszj_openshift-marketplace(715e8d60-13c8-442f-bec0-2f2fd1cfe172): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 06:55:56 crc kubenswrapper[4941]: E0307 06:55:56.437574 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xsszj" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" Mar 07 06:55:58 crc kubenswrapper[4941]: I0307 06:55:58.291019 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 06:55:58 crc kubenswrapper[4941]: I0307 06:55:58.801277 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8n7vr"] Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.440897 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.443763 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.445274 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.540513 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.540805 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.540848 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-var-lock\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.641977 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-var-lock\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.642064 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.642122 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.642192 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.642226 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-var-lock\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.664706 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:55:59 crc kubenswrapper[4941]: I0307 06:55:59.774182 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.131600 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547776-g4ksl"] Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.132369 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.134530 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.143134 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547776-g4ksl"] Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.149474 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntqs\" (UniqueName: \"kubernetes.io/projected/e3fa14c7-e4f9-42fc-8972-8d18263ee801-kube-api-access-xntqs\") pod \"auto-csr-approver-29547776-g4ksl\" (UID: \"e3fa14c7-e4f9-42fc-8972-8d18263ee801\") " pod="openshift-infra/auto-csr-approver-29547776-g4ksl" Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.251598 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntqs\" (UniqueName: \"kubernetes.io/projected/e3fa14c7-e4f9-42fc-8972-8d18263ee801-kube-api-access-xntqs\") pod \"auto-csr-approver-29547776-g4ksl\" (UID: \"e3fa14c7-e4f9-42fc-8972-8d18263ee801\") " pod="openshift-infra/auto-csr-approver-29547776-g4ksl" Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.269287 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntqs\" (UniqueName: \"kubernetes.io/projected/e3fa14c7-e4f9-42fc-8972-8d18263ee801-kube-api-access-xntqs\") pod \"auto-csr-approver-29547776-g4ksl\" (UID: \"e3fa14c7-e4f9-42fc-8972-8d18263ee801\") " 
pod="openshift-infra/auto-csr-approver-29547776-g4ksl" Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.452822 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" Mar 07 06:56:00 crc kubenswrapper[4941]: E0307 06:56:00.456460 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xsszj" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" Mar 07 06:56:00 crc kubenswrapper[4941]: E0307 06:56:00.570550 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 07 06:56:00 crc kubenswrapper[4941]: E0307 06:56:00.571240 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5lmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ktp7f_openshift-marketplace(d003569d-8946-47e7-adf2-5148ca8de944): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 06:56:00 crc kubenswrapper[4941]: E0307 06:56:00.570588 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 07 06:56:00 crc kubenswrapper[4941]: E0307 06:56:00.571359 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-254k7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qx5dk_openshift-marketplace(bdb71b40-ad9b-405b-a178-158109d65a92): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 06:56:00 crc kubenswrapper[4941]: E0307 06:56:00.572453 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qx5dk" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" Mar 07 06:56:00 crc kubenswrapper[4941]: E0307 06:56:00.572578 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ktp7f" podUID="d003569d-8946-47e7-adf2-5148ca8de944" Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.872691 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:56:00 crc kubenswrapper[4941]: I0307 06:56:00.872770 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:56:01 crc kubenswrapper[4941]: E0307 06:56:01.885920 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qx5dk" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" Mar 07 06:56:01 crc kubenswrapper[4941]: E0307 06:56:01.886043 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-ktp7f" podUID="d003569d-8946-47e7-adf2-5148ca8de944" Mar 07 06:56:01 crc kubenswrapper[4941]: E0307 06:56:01.957338 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 07 06:56:01 crc kubenswrapper[4941]: E0307 06:56:01.957588 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qw9x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:ni
l,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vtxk7_openshift-marketplace(37622fc0-c5dc-4e0d-848a-214bce293f7f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 06:56:01 crc kubenswrapper[4941]: E0307 06:56:01.958805 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vtxk7" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" Mar 07 06:56:01 crc kubenswrapper[4941]: E0307 06:56:01.989388 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 07 06:56:01 crc kubenswrapper[4941]: E0307 06:56:01.989607 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2k5mh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-g4dmh_openshift-marketplace(36212ca9-755e-4104-a203-7c136afbfca9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 06:56:01 crc kubenswrapper[4941]: E0307 06:56:01.990839 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-g4dmh" podUID="36212ca9-755e-4104-a203-7c136afbfca9" Mar 07 06:56:03 crc 
kubenswrapper[4941]: E0307 06:56:03.569646 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vtxk7" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.569686 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-g4dmh" podUID="36212ca9-755e-4104-a203-7c136afbfca9" Mar 07 06:56:03 crc kubenswrapper[4941]: I0307 06:56:03.580251 4941 scope.go:117] "RemoveContainer" containerID="ee43eb1a4c3bc7552b813e059be0f313e9ef224d9b01ad892ad7e2d164e22781" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.664364 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.665022 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4pcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jg549_openshift-marketplace(86719fee-4b62-4f53-958e-9e87f56a9062): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.666254 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jg549" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" Mar 07 06:56:03 crc 
kubenswrapper[4941]: E0307 06:56:03.686746 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.686921 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cs5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-z4zqp_openshift-marketplace(805b56ac-66fd-4704-adb1-f3968f17f835): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.688197 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-z4zqp" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.698063 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.698240 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4mz8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5v5x2_openshift-marketplace(a10e3708-a476-4698-aa8d-ba99a795524a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 06:56:03 crc kubenswrapper[4941]: E0307 06:56:03.699463 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5v5x2" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" Mar 07 06:56:04 crc 
kubenswrapper[4941]: I0307 06:56:04.069520 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 06:56:04 crc kubenswrapper[4941]: I0307 06:56:04.091496 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66"] Mar 07 06:56:04 crc kubenswrapper[4941]: I0307 06:56:04.108005 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5tr44" event={"ID":"a77c6084-94de-4ebc-9a75-a83efa28b094","Type":"ContainerStarted","Data":"923ff06c772bba5ab0e9973359df75d9b5ce119365f0ab372a2ed333a0f791f2"} Mar 07 06:56:04 crc kubenswrapper[4941]: I0307 06:56:04.108311 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:56:04 crc kubenswrapper[4941]: I0307 06:56:04.108950 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:56:04 crc kubenswrapper[4941]: I0307 06:56:04.109003 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:56:04 crc kubenswrapper[4941]: E0307 06:56:04.115145 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5v5x2" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" Mar 07 06:56:04 crc kubenswrapper[4941]: E0307 
06:56:04.115452 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-z4zqp" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" Mar 07 06:56:04 crc kubenswrapper[4941]: E0307 06:56:04.115521 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jg549" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" Mar 07 06:56:04 crc kubenswrapper[4941]: I0307 06:56:04.155548 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 06:56:04 crc kubenswrapper[4941]: W0307 06:56:04.166343 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb5abd453_0ae9_420c_92b5_84b76e1b4a6a.slice/crio-1e11e91ea2085cccd761445af3c67ae0a50b428b64b88bdf9674f2fa79b166e2 WatchSource:0}: Error finding container 1e11e91ea2085cccd761445af3c67ae0a50b428b64b88bdf9674f2fa79b166e2: Status 404 returned error can't find the container with id 1e11e91ea2085cccd761445af3c67ae0a50b428b64b88bdf9674f2fa79b166e2 Mar 07 06:56:04 crc kubenswrapper[4941]: I0307 06:56:04.206794 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59694df6ff-qr8hr"] Mar 07 06:56:04 crc kubenswrapper[4941]: I0307 06:56:04.210906 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547776-g4ksl"] Mar 07 06:56:04 crc kubenswrapper[4941]: W0307 06:56:04.220448 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ccd6084_ccb5_405e_ab81_d5447115d772.slice/crio-f7558d1952c5804d6b280746acdfec0467da19a325b132bd4943f89c3068f784 WatchSource:0}: Error finding container f7558d1952c5804d6b280746acdfec0467da19a325b132bd4943f89c3068f784: Status 404 returned error can't find the container with id f7558d1952c5804d6b280746acdfec0467da19a325b132bd4943f89c3068f784 Mar 07 06:56:04 crc kubenswrapper[4941]: W0307 06:56:04.227545 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3fa14c7_e4f9_42fc_8972_8d18263ee801.slice/crio-5c2956e0bf25ebfee7d4dce6923f70654e20ae844f0c1890aa5adc61f53c5803 WatchSource:0}: Error finding container 5c2956e0bf25ebfee7d4dce6923f70654e20ae844f0c1890aa5adc61f53c5803: Status 404 returned error can't find the container with id 5c2956e0bf25ebfee7d4dce6923f70654e20ae844f0c1890aa5adc61f53c5803 Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.116098 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" event={"ID":"e3fa14c7-e4f9-42fc-8972-8d18263ee801","Type":"ContainerStarted","Data":"5c2956e0bf25ebfee7d4dce6923f70654e20ae844f0c1890aa5adc61f53c5803"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.117528 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b5abd453-0ae9-420c-92b5-84b76e1b4a6a","Type":"ContainerStarted","Data":"b380491549d4656d158f042ce2c960e96d68a628025a0caceea954b15a9a0b93"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.117584 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b5abd453-0ae9-420c-92b5-84b76e1b4a6a","Type":"ContainerStarted","Data":"1e11e91ea2085cccd761445af3c67ae0a50b428b64b88bdf9674f2fa79b166e2"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.118748 4941 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" event={"ID":"2ccd6084-ccb5-405e-ab81-d5447115d772","Type":"ContainerStarted","Data":"8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.118801 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" event={"ID":"2ccd6084-ccb5-405e-ab81-d5447115d772","Type":"ContainerStarted","Data":"f7558d1952c5804d6b280746acdfec0467da19a325b132bd4943f89c3068f784"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.118820 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.121094 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4a2ec596-b148-426d-a94f-b64539627393","Type":"ContainerStarted","Data":"bc269d599c0448ce57abeb7f94fe5c0aea37285eeda0d1bfad5ae7124f8574bb"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.121141 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4a2ec596-b148-426d-a94f-b64539627393","Type":"ContainerStarted","Data":"df5f6a94a567c6909d9b4854aed1030f982ceff0ceae94f9a3fbc3705349bf71"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.124595 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" podUID="481e3303-aed7-4e24-b1ce-62f444e7fc1f" containerName="route-controller-manager" containerID="cri-o://cb5a9b94dd254c79fe469976f54c30eb0d69858ae993d588ca0367866f3aa31e" gracePeriod=30 Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.125284 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" event={"ID":"481e3303-aed7-4e24-b1ce-62f444e7fc1f","Type":"ContainerStarted","Data":"cb5a9b94dd254c79fe469976f54c30eb0d69858ae993d588ca0367866f3aa31e"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.125322 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" event={"ID":"481e3303-aed7-4e24-b1ce-62f444e7fc1f","Type":"ContainerStarted","Data":"6782c399d134175771c82c0866ed8a55c38a662c600148ed11de4fa31266440b"} Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.127581 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.127769 4941 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tr44 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.127824 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tr44" podUID="a77c6084-94de-4ebc-9a75-a83efa28b094" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.129439 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.133869 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.152348 4941 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.152329858 podStartE2EDuration="6.152329858s" podCreationTimestamp="2026-03-07 06:55:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:05.14981353 +0000 UTC m=+262.102178995" watchObservedRunningTime="2026-03-07 06:56:05.152329858 +0000 UTC m=+262.104695323" Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.172173 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" podStartSLOduration=11.172149649 podStartE2EDuration="11.172149649s" podCreationTimestamp="2026-03-07 06:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:05.17146734 +0000 UTC m=+262.123832805" watchObservedRunningTime="2026-03-07 06:56:05.172149649 +0000 UTC m=+262.124515114" Mar 07 06:56:05 crc kubenswrapper[4941]: I0307 06:56:05.187704 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" podStartSLOduration=31.187677203 podStartE2EDuration="31.187677203s" podCreationTimestamp="2026-03-07 06:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:05.187245331 +0000 UTC m=+262.139610826" watchObservedRunningTime="2026-03-07 06:56:05.187677203 +0000 UTC m=+262.140042668" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.141112 4941 generic.go:334] "Generic (PLEG): container finished" podID="4a2ec596-b148-426d-a94f-b64539627393" containerID="bc269d599c0448ce57abeb7f94fe5c0aea37285eeda0d1bfad5ae7124f8574bb" exitCode=0 Mar 07 06:56:06 crc kubenswrapper[4941]: 
I0307 06:56:06.141225 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4a2ec596-b148-426d-a94f-b64539627393","Type":"ContainerDied","Data":"bc269d599c0448ce57abeb7f94fe5c0aea37285eeda0d1bfad5ae7124f8574bb"} Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.146856 4941 generic.go:334] "Generic (PLEG): container finished" podID="481e3303-aed7-4e24-b1ce-62f444e7fc1f" containerID="cb5a9b94dd254c79fe469976f54c30eb0d69858ae993d588ca0367866f3aa31e" exitCode=0 Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.146934 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" event={"ID":"481e3303-aed7-4e24-b1ce-62f444e7fc1f","Type":"ContainerDied","Data":"cb5a9b94dd254c79fe469976f54c30eb0d69858ae993d588ca0367866f3aa31e"} Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.658377 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.672039 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t66bn\" (UniqueName: \"kubernetes.io/projected/481e3303-aed7-4e24-b1ce-62f444e7fc1f-kube-api-access-t66bn\") pod \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.672096 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/481e3303-aed7-4e24-b1ce-62f444e7fc1f-serving-cert\") pod \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.672141 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-config\") pod \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.672212 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-client-ca\") pod \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\" (UID: \"481e3303-aed7-4e24-b1ce-62f444e7fc1f\") " Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.673074 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-client-ca" (OuterVolumeSpecName: "client-ca") pod "481e3303-aed7-4e24-b1ce-62f444e7fc1f" (UID: "481e3303-aed7-4e24-b1ce-62f444e7fc1f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.673267 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-config" (OuterVolumeSpecName: "config") pod "481e3303-aed7-4e24-b1ce-62f444e7fc1f" (UID: "481e3303-aed7-4e24-b1ce-62f444e7fc1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.679397 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481e3303-aed7-4e24-b1ce-62f444e7fc1f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "481e3303-aed7-4e24-b1ce-62f444e7fc1f" (UID: "481e3303-aed7-4e24-b1ce-62f444e7fc1f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.679866 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481e3303-aed7-4e24-b1ce-62f444e7fc1f-kube-api-access-t66bn" (OuterVolumeSpecName: "kube-api-access-t66bn") pod "481e3303-aed7-4e24-b1ce-62f444e7fc1f" (UID: "481e3303-aed7-4e24-b1ce-62f444e7fc1f"). InnerVolumeSpecName "kube-api-access-t66bn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.691664 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8"] Mar 07 06:56:06 crc kubenswrapper[4941]: E0307 06:56:06.691903 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481e3303-aed7-4e24-b1ce-62f444e7fc1f" containerName="route-controller-manager" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.691916 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="481e3303-aed7-4e24-b1ce-62f444e7fc1f" containerName="route-controller-manager" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.692026 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="481e3303-aed7-4e24-b1ce-62f444e7fc1f" containerName="route-controller-manager" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.692460 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.717152 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8"] Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.773696 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-config\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.773752 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e3b829b8-41e3-4d54-a4c0-66338dd0d902-serving-cert\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.773786 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcg99\" (UniqueName: \"kubernetes.io/projected/e3b829b8-41e3-4d54-a4c0-66338dd0d902-kube-api-access-rcg99\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.773817 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-client-ca\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.773910 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t66bn\" (UniqueName: \"kubernetes.io/projected/481e3303-aed7-4e24-b1ce-62f444e7fc1f-kube-api-access-t66bn\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.773925 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/481e3303-aed7-4e24-b1ce-62f444e7fc1f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.773937 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:06 crc 
kubenswrapper[4941]: I0307 06:56:06.773948 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/481e3303-aed7-4e24-b1ce-62f444e7fc1f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.875465 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-config\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.875521 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b829b8-41e3-4d54-a4c0-66338dd0d902-serving-cert\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.875544 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcg99\" (UniqueName: \"kubernetes.io/projected/e3b829b8-41e3-4d54-a4c0-66338dd0d902-kube-api-access-rcg99\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.875568 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-client-ca\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc 
kubenswrapper[4941]: I0307 06:56:06.876679 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-client-ca\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.877744 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-config\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.879868 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b829b8-41e3-4d54-a4c0-66338dd0d902-serving-cert\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:06 crc kubenswrapper[4941]: I0307 06:56:06.892543 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcg99\" (UniqueName: \"kubernetes.io/projected/e3b829b8-41e3-4d54-a4c0-66338dd0d902-kube-api-access-rcg99\") pod \"route-controller-manager-848dff8976-xv9k8\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.049380 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.154147 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" event={"ID":"481e3303-aed7-4e24-b1ce-62f444e7fc1f","Type":"ContainerDied","Data":"6782c399d134175771c82c0866ed8a55c38a662c600148ed11de4fa31266440b"} Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.154209 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.154224 4941 scope.go:117] "RemoveContainer" containerID="cb5a9b94dd254c79fe469976f54c30eb0d69858ae993d588ca0367866f3aa31e" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.192512 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66"] Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.201019 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487c9d76-2sm66"] Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.434566 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.582204 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a2ec596-b148-426d-a94f-b64539627393-kubelet-dir\") pod \"4a2ec596-b148-426d-a94f-b64539627393\" (UID: \"4a2ec596-b148-426d-a94f-b64539627393\") " Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.582299 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a2ec596-b148-426d-a94f-b64539627393-kube-api-access\") pod \"4a2ec596-b148-426d-a94f-b64539627393\" (UID: \"4a2ec596-b148-426d-a94f-b64539627393\") " Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.582582 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a2ec596-b148-426d-a94f-b64539627393-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4a2ec596-b148-426d-a94f-b64539627393" (UID: "4a2ec596-b148-426d-a94f-b64539627393"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.590532 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2ec596-b148-426d-a94f-b64539627393-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4a2ec596-b148-426d-a94f-b64539627393" (UID: "4a2ec596-b148-426d-a94f-b64539627393"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.684346 4941 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a2ec596-b148-426d-a94f-b64539627393-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.684856 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a2ec596-b148-426d-a94f-b64539627393-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.865994 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8"] Mar 07 06:56:07 crc kubenswrapper[4941]: W0307 06:56:07.873374 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b829b8_41e3_4d54_a4c0_66338dd0d902.slice/crio-87f09fb1980b1db6dbaa317cc4029a499c36460580c4fa92a1f1d2346686f95e WatchSource:0}: Error finding container 87f09fb1980b1db6dbaa317cc4029a499c36460580c4fa92a1f1d2346686f95e: Status 404 returned error can't find the container with id 87f09fb1980b1db6dbaa317cc4029a499c36460580c4fa92a1f1d2346686f95e Mar 07 06:56:07 crc kubenswrapper[4941]: I0307 06:56:07.961611 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481e3303-aed7-4e24-b1ce-62f444e7fc1f" path="/var/lib/kubelet/pods/481e3303-aed7-4e24-b1ce-62f444e7fc1f/volumes" Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.162433 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" event={"ID":"7566a1ff-1f6b-4e99-903b-ff036f98c411","Type":"ContainerStarted","Data":"34d2d8c8f7224ea5b19fb4cc349409b66394df63edeaec259d4dc4449b46fb16"} Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.163801 4941 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" event={"ID":"e3b829b8-41e3-4d54-a4c0-66338dd0d902","Type":"ContainerStarted","Data":"c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a"} Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.163865 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" event={"ID":"e3b829b8-41e3-4d54-a4c0-66338dd0d902","Type":"ContainerStarted","Data":"87f09fb1980b1db6dbaa317cc4029a499c36460580c4fa92a1f1d2346686f95e"} Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.165273 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" event={"ID":"e3fa14c7-e4f9-42fc-8972-8d18263ee801","Type":"ContainerStarted","Data":"e1640f8173deaee95910a78d9af2192ee7718b4419a38bd0a2618c121759c01f"} Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.169392 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4a2ec596-b148-426d-a94f-b64539627393","Type":"ContainerDied","Data":"df5f6a94a567c6909d9b4854aed1030f982ceff0ceae94f9a3fbc3705349bf71"} Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.169527 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df5f6a94a567c6909d9b4854aed1030f982ceff0ceae94f9a3fbc3705349bf71" Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.169478 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.188928 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" podStartSLOduration=5.009013147 podStartE2EDuration="8.188909923s" podCreationTimestamp="2026-03-07 06:56:00 +0000 UTC" firstStartedPulling="2026-03-07 06:56:04.251602106 +0000 UTC m=+261.203967571" lastFinishedPulling="2026-03-07 06:56:07.431498892 +0000 UTC m=+264.383864347" observedRunningTime="2026-03-07 06:56:08.184437671 +0000 UTC m=+265.136803136" watchObservedRunningTime="2026-03-07 06:56:08.188909923 +0000 UTC m=+265.141275388" Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.454974 4941 csr.go:261] certificate signing request csr-zxhb7 is approved, waiting to be issued Mar 07 06:56:08 crc kubenswrapper[4941]: I0307 06:56:08.462623 4941 csr.go:257] certificate signing request csr-zxhb7 is issued Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.175475 4941 generic.go:334] "Generic (PLEG): container finished" podID="e3fa14c7-e4f9-42fc-8972-8d18263ee801" containerID="e1640f8173deaee95910a78d9af2192ee7718b4419a38bd0a2618c121759c01f" exitCode=0 Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.175583 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" event={"ID":"e3fa14c7-e4f9-42fc-8972-8d18263ee801","Type":"ContainerDied","Data":"e1640f8173deaee95910a78d9af2192ee7718b4419a38bd0a2618c121759c01f"} Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.179907 4941 generic.go:334] "Generic (PLEG): container finished" podID="7566a1ff-1f6b-4e99-903b-ff036f98c411" containerID="34d2d8c8f7224ea5b19fb4cc349409b66394df63edeaec259d4dc4449b46fb16" exitCode=0 Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.180571 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" 
event={"ID":"7566a1ff-1f6b-4e99-903b-ff036f98c411","Type":"ContainerDied","Data":"34d2d8c8f7224ea5b19fb4cc349409b66394df63edeaec259d4dc4449b46fb16"} Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.180616 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.190047 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.211880 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" podStartSLOduration=15.21185712 podStartE2EDuration="15.21185712s" podCreationTimestamp="2026-03-07 06:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:09.205009203 +0000 UTC m=+266.157374678" watchObservedRunningTime="2026-03-07 06:56:09.21185712 +0000 UTC m=+266.164222595" Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.464155 4941 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-28 07:09:58.051329642 +0000 UTC Mar 07 06:56:09 crc kubenswrapper[4941]: I0307 06:56:09.464206 4941 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7104h13m48.587127115s for next certificate rotation Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.314517 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 06:56:10 crc 
kubenswrapper[4941]: I0307 06:56:10.314577 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.314632 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.315352 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.315446 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a" gracePeriod=600 Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.465164 4941 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-12 18:11:55.096652063 +0000 UTC Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.465550 4941 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7475h15m44.631105817s for next certificate rotation Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.636882 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.725098 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntqs\" (UniqueName: \"kubernetes.io/projected/e3fa14c7-e4f9-42fc-8972-8d18263ee801-kube-api-access-xntqs\") pod \"e3fa14c7-e4f9-42fc-8972-8d18263ee801\" (UID: \"e3fa14c7-e4f9-42fc-8972-8d18263ee801\") " Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.731836 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3fa14c7-e4f9-42fc-8972-8d18263ee801-kube-api-access-xntqs" (OuterVolumeSpecName: "kube-api-access-xntqs") pod "e3fa14c7-e4f9-42fc-8972-8d18263ee801" (UID: "e3fa14c7-e4f9-42fc-8972-8d18263ee801"). InnerVolumeSpecName "kube-api-access-xntqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.752881 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.826437 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxhjd\" (UniqueName: \"kubernetes.io/projected/7566a1ff-1f6b-4e99-903b-ff036f98c411-kube-api-access-sxhjd\") pod \"7566a1ff-1f6b-4e99-903b-ff036f98c411\" (UID: \"7566a1ff-1f6b-4e99-903b-ff036f98c411\") " Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.827675 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xntqs\" (UniqueName: \"kubernetes.io/projected/e3fa14c7-e4f9-42fc-8972-8d18263ee801-kube-api-access-xntqs\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.830729 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7566a1ff-1f6b-4e99-903b-ff036f98c411-kube-api-access-sxhjd" (OuterVolumeSpecName: "kube-api-access-sxhjd") pod "7566a1ff-1f6b-4e99-903b-ff036f98c411" (UID: "7566a1ff-1f6b-4e99-903b-ff036f98c411"). InnerVolumeSpecName "kube-api-access-sxhjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.888696 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5tr44" Mar 07 06:56:10 crc kubenswrapper[4941]: I0307 06:56:10.928192 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxhjd\" (UniqueName: \"kubernetes.io/projected/7566a1ff-1f6b-4e99-903b-ff036f98c411-kube-api-access-sxhjd\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:11 crc kubenswrapper[4941]: I0307 06:56:11.194328 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" event={"ID":"7566a1ff-1f6b-4e99-903b-ff036f98c411","Type":"ContainerDied","Data":"b454bce4a04d2624d7259e6a84ee526893d699dd826268c0d71c8547ea37d33f"} Mar 07 06:56:11 crc kubenswrapper[4941]: I0307 06:56:11.194428 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b454bce4a04d2624d7259e6a84ee526893d699dd826268c0d71c8547ea37d33f" Mar 07 06:56:11 crc kubenswrapper[4941]: I0307 06:56:11.194440 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547774-2b2fh" Mar 07 06:56:11 crc kubenswrapper[4941]: I0307 06:56:11.197462 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a" exitCode=0 Mar 07 06:56:11 crc kubenswrapper[4941]: I0307 06:56:11.197532 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a"} Mar 07 06:56:11 crc kubenswrapper[4941]: I0307 06:56:11.198691 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" event={"ID":"e3fa14c7-e4f9-42fc-8972-8d18263ee801","Type":"ContainerDied","Data":"5c2956e0bf25ebfee7d4dce6923f70654e20ae844f0c1890aa5adc61f53c5803"} Mar 07 06:56:11 crc kubenswrapper[4941]: I0307 06:56:11.198843 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c2956e0bf25ebfee7d4dce6923f70654e20ae844f0c1890aa5adc61f53c5803" Mar 07 06:56:11 crc kubenswrapper[4941]: I0307 06:56:11.198874 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547776-g4ksl" Mar 07 06:56:12 crc kubenswrapper[4941]: I0307 06:56:12.206114 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"7416552df2e4055270a338bb841d809844616b740ce91a60c1b321d6a4dac058"} Mar 07 06:56:14 crc kubenswrapper[4941]: I0307 06:56:14.600134 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59694df6ff-qr8hr"] Mar 07 06:56:14 crc kubenswrapper[4941]: I0307 06:56:14.601506 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" podUID="2ccd6084-ccb5-405e-ab81-d5447115d772" containerName="controller-manager" containerID="cri-o://8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17" gracePeriod=30 Mar 07 06:56:14 crc kubenswrapper[4941]: I0307 06:56:14.608958 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8"] Mar 07 06:56:14 crc kubenswrapper[4941]: I0307 06:56:14.609155 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" podUID="e3b829b8-41e3-4d54-a4c0-66338dd0d902" containerName="route-controller-manager" containerID="cri-o://c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a" gracePeriod=30 Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.178202 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.242342 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.243102 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktp7f" event={"ID":"d003569d-8946-47e7-adf2-5148ca8de944","Type":"ContainerStarted","Data":"d7683ad9d1dce5316d63834d4de252268c91d6465c504103cf7535ac1e1040d6"} Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.248609 4941 generic.go:334] "Generic (PLEG): container finished" podID="e3b829b8-41e3-4d54-a4c0-66338dd0d902" containerID="c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a" exitCode=0 Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.248703 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" event={"ID":"e3b829b8-41e3-4d54-a4c0-66338dd0d902","Type":"ContainerDied","Data":"c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a"} Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.248749 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" event={"ID":"e3b829b8-41e3-4d54-a4c0-66338dd0d902","Type":"ContainerDied","Data":"87f09fb1980b1db6dbaa317cc4029a499c36460580c4fa92a1f1d2346686f95e"} Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.248774 4941 scope.go:117] "RemoveContainer" containerID="c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.248949 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.251443 4941 generic.go:334] "Generic (PLEG): container finished" podID="2ccd6084-ccb5-405e-ab81-d5447115d772" containerID="8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17" exitCode=0 Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.252609 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.252943 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" event={"ID":"2ccd6084-ccb5-405e-ab81-d5447115d772","Type":"ContainerDied","Data":"8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17"} Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.253001 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" event={"ID":"2ccd6084-ccb5-405e-ab81-d5447115d772","Type":"ContainerDied","Data":"f7558d1952c5804d6b280746acdfec0467da19a325b132bd4943f89c3068f784"} Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.279115 4941 scope.go:117] "RemoveContainer" containerID="c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a" Mar 07 06:56:15 crc kubenswrapper[4941]: E0307 06:56:15.279721 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a\": container with ID starting with c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a not found: ID does not exist" containerID="c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.279759 4941 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a"} err="failed to get container status \"c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a\": rpc error: code = NotFound desc = could not find container \"c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a\": container with ID starting with c9753e27573bd0f481f3f16c5a699cf8d38254bb916ca3d392a6a7b848a01d9a not found: ID does not exist" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.279782 4941 scope.go:117] "RemoveContainer" containerID="8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.285980 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcg99\" (UniqueName: \"kubernetes.io/projected/e3b829b8-41e3-4d54-a4c0-66338dd0d902-kube-api-access-rcg99\") pod \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.286036 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b829b8-41e3-4d54-a4c0-66338dd0d902-serving-cert\") pod \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.286128 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-config\") pod \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.286163 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-client-ca\") pod 
\"e3b829b8-41e3-4d54-a4c0-66338dd0d902\" (UID: \"e3b829b8-41e3-4d54-a4c0-66338dd0d902\") " Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.287490 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-client-ca" (OuterVolumeSpecName: "client-ca") pod "e3b829b8-41e3-4d54-a4c0-66338dd0d902" (UID: "e3b829b8-41e3-4d54-a4c0-66338dd0d902"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.287598 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-config" (OuterVolumeSpecName: "config") pod "e3b829b8-41e3-4d54-a4c0-66338dd0d902" (UID: "e3b829b8-41e3-4d54-a4c0-66338dd0d902"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.295762 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b829b8-41e3-4d54-a4c0-66338dd0d902-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e3b829b8-41e3-4d54-a4c0-66338dd0d902" (UID: "e3b829b8-41e3-4d54-a4c0-66338dd0d902"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.296796 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b829b8-41e3-4d54-a4c0-66338dd0d902-kube-api-access-rcg99" (OuterVolumeSpecName: "kube-api-access-rcg99") pod "e3b829b8-41e3-4d54-a4c0-66338dd0d902" (UID: "e3b829b8-41e3-4d54-a4c0-66338dd0d902"). InnerVolumeSpecName "kube-api-access-rcg99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.298900 4941 scope.go:117] "RemoveContainer" containerID="8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17" Mar 07 06:56:15 crc kubenswrapper[4941]: E0307 06:56:15.299599 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17\": container with ID starting with 8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17 not found: ID does not exist" containerID="8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.299646 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17"} err="failed to get container status \"8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17\": rpc error: code = NotFound desc = could not find container \"8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17\": container with ID starting with 8500e5b87a62f6b9679e076ea70f6f35bdd35bbaf6261b806c685d2a18952e17 not found: ID does not exist" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.387596 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-client-ca\") pod \"2ccd6084-ccb5-405e-ab81-d5447115d772\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.387662 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-proxy-ca-bundles\") pod \"2ccd6084-ccb5-405e-ab81-d5447115d772\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " Mar 
07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.387738 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-config\") pod \"2ccd6084-ccb5-405e-ab81-d5447115d772\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.387802 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv88f\" (UniqueName: \"kubernetes.io/projected/2ccd6084-ccb5-405e-ab81-d5447115d772-kube-api-access-hv88f\") pod \"2ccd6084-ccb5-405e-ab81-d5447115d772\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.387855 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ccd6084-ccb5-405e-ab81-d5447115d772-serving-cert\") pod \"2ccd6084-ccb5-405e-ab81-d5447115d772\" (UID: \"2ccd6084-ccb5-405e-ab81-d5447115d772\") " Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.388232 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.388249 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b829b8-41e3-4d54-a4c0-66338dd0d902-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.388262 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcg99\" (UniqueName: \"kubernetes.io/projected/e3b829b8-41e3-4d54-a4c0-66338dd0d902-kube-api-access-rcg99\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.388272 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e3b829b8-41e3-4d54-a4c0-66338dd0d902-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.388748 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-client-ca" (OuterVolumeSpecName: "client-ca") pod "2ccd6084-ccb5-405e-ab81-d5447115d772" (UID: "2ccd6084-ccb5-405e-ab81-d5447115d772"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.388882 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2ccd6084-ccb5-405e-ab81-d5447115d772" (UID: "2ccd6084-ccb5-405e-ab81-d5447115d772"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.388863 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-config" (OuterVolumeSpecName: "config") pod "2ccd6084-ccb5-405e-ab81-d5447115d772" (UID: "2ccd6084-ccb5-405e-ab81-d5447115d772"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.391128 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccd6084-ccb5-405e-ab81-d5447115d772-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2ccd6084-ccb5-405e-ab81-d5447115d772" (UID: "2ccd6084-ccb5-405e-ab81-d5447115d772"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.391350 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ccd6084-ccb5-405e-ab81-d5447115d772-kube-api-access-hv88f" (OuterVolumeSpecName: "kube-api-access-hv88f") pod "2ccd6084-ccb5-405e-ab81-d5447115d772" (UID: "2ccd6084-ccb5-405e-ab81-d5447115d772"). InnerVolumeSpecName "kube-api-access-hv88f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.489749 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv88f\" (UniqueName: \"kubernetes.io/projected/2ccd6084-ccb5-405e-ab81-d5447115d772-kube-api-access-hv88f\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.489796 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ccd6084-ccb5-405e-ab81-d5447115d772-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.489810 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.489819 4941 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.489828 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ccd6084-ccb5-405e-ab81-d5447115d772-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.594879 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-59694df6ff-qr8hr"] Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.602256 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59694df6ff-qr8hr"] Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.605514 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8"] Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.609903 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-848dff8976-xv9k8"] Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.964375 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ccd6084-ccb5-405e-ab81-d5447115d772" path="/var/lib/kubelet/pods/2ccd6084-ccb5-405e-ab81-d5447115d772/volumes" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.965002 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b829b8-41e3-4d54-a4c0-66338dd0d902" path="/var/lib/kubelet/pods/e3b829b8-41e3-4d54-a4c0-66338dd0d902/volumes" Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.976909 4941 patch_prober.go:28] interesting pod/controller-manager-59694df6ff-qr8hr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 06:56:15 crc kubenswrapper[4941]: I0307 06:56:15.976972 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-59694df6ff-qr8hr" podUID="2ccd6084-ccb5-405e-ab81-d5447115d772" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008179 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86495f659f-6xnjp"] Mar 07 06:56:16 crc kubenswrapper[4941]: E0307 06:56:16.008475 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2ec596-b148-426d-a94f-b64539627393" containerName="pruner" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008490 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2ec596-b148-426d-a94f-b64539627393" containerName="pruner" Mar 07 06:56:16 crc kubenswrapper[4941]: E0307 06:56:16.008505 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccd6084-ccb5-405e-ab81-d5447115d772" containerName="controller-manager" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008512 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ccd6084-ccb5-405e-ab81-d5447115d772" containerName="controller-manager" Mar 07 06:56:16 crc kubenswrapper[4941]: E0307 06:56:16.008523 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b829b8-41e3-4d54-a4c0-66338dd0d902" containerName="route-controller-manager" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008531 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b829b8-41e3-4d54-a4c0-66338dd0d902" containerName="route-controller-manager" Mar 07 06:56:16 crc kubenswrapper[4941]: E0307 06:56:16.008538 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fa14c7-e4f9-42fc-8972-8d18263ee801" containerName="oc" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008545 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fa14c7-e4f9-42fc-8972-8d18263ee801" containerName="oc" Mar 07 06:56:16 crc kubenswrapper[4941]: E0307 06:56:16.008556 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7566a1ff-1f6b-4e99-903b-ff036f98c411" containerName="oc" Mar 07 06:56:16 
crc kubenswrapper[4941]: I0307 06:56:16.008561 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7566a1ff-1f6b-4e99-903b-ff036f98c411" containerName="oc" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008666 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccd6084-ccb5-405e-ab81-d5447115d772" containerName="controller-manager" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008679 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2ec596-b148-426d-a94f-b64539627393" containerName="pruner" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008690 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="7566a1ff-1f6b-4e99-903b-ff036f98c411" containerName="oc" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008708 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fa14c7-e4f9-42fc-8972-8d18263ee801" containerName="oc" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.008716 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b829b8-41e3-4d54-a4c0-66338dd0d902" containerName="route-controller-manager" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.009129 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.011423 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.011924 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.012130 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.012258 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.012456 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.015770 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.037998 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277"] Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.038996 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.039176 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.046275 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86495f659f-6xnjp"] Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.047131 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.047564 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.047837 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.048666 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.052031 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.052312 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.072166 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277"] Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.097137 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75569bd2-5c18-4a4d-9545-210a324a4b7a-serving-cert\") pod \"controller-manager-86495f659f-6xnjp\" (UID: 
\"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.097234 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-proxy-ca-bundles\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.097854 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-client-ca\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.097992 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w9zx\" (UniqueName: \"kubernetes.io/projected/75569bd2-5c18-4a4d-9545-210a324a4b7a-kube-api-access-6w9zx\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.098474 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-config\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.199807 4941 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-config\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.199882 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-client-ca\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.199913 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75569bd2-5c18-4a4d-9545-210a324a4b7a-serving-cert\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.199940 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-config\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.199960 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-proxy-ca-bundles\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " 
pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.200004 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4rnh\" (UniqueName: \"kubernetes.io/projected/3034b0d0-6866-4391-ac47-43bd5312adb1-kube-api-access-k4rnh\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.200033 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-client-ca\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.200056 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3034b0d0-6866-4391-ac47-43bd5312adb1-serving-cert\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.200081 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w9zx\" (UniqueName: \"kubernetes.io/projected/75569bd2-5c18-4a4d-9545-210a324a4b7a-kube-api-access-6w9zx\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.201642 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-client-ca\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.201774 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-proxy-ca-bundles\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.201655 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-config\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.207464 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75569bd2-5c18-4a4d-9545-210a324a4b7a-serving-cert\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.220026 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w9zx\" (UniqueName: \"kubernetes.io/projected/75569bd2-5c18-4a4d-9545-210a324a4b7a-kube-api-access-6w9zx\") pod \"controller-manager-86495f659f-6xnjp\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.267849 4941 generic.go:334] "Generic 
(PLEG): container finished" podID="d003569d-8946-47e7-adf2-5148ca8de944" containerID="d7683ad9d1dce5316d63834d4de252268c91d6465c504103cf7535ac1e1040d6" exitCode=0 Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.267930 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktp7f" event={"ID":"d003569d-8946-47e7-adf2-5148ca8de944","Type":"ContainerDied","Data":"d7683ad9d1dce5316d63834d4de252268c91d6465c504103cf7535ac1e1040d6"} Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.301121 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4rnh\" (UniqueName: \"kubernetes.io/projected/3034b0d0-6866-4391-ac47-43bd5312adb1-kube-api-access-k4rnh\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.301620 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3034b0d0-6866-4391-ac47-43bd5312adb1-serving-cert\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.301677 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-client-ca\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.301717 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-config\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.302901 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-config\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.303132 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-client-ca\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.307233 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3034b0d0-6866-4391-ac47-43bd5312adb1-serving-cert\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.318547 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4rnh\" (UniqueName: \"kubernetes.io/projected/3034b0d0-6866-4391-ac47-43bd5312adb1-kube-api-access-k4rnh\") pod \"route-controller-manager-57778f6ddf-cx277\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc 
kubenswrapper[4941]: I0307 06:56:16.340335 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.358930 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.779966 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277"] Mar 07 06:56:16 crc kubenswrapper[4941]: I0307 06:56:16.855112 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86495f659f-6xnjp"] Mar 07 06:56:17 crc kubenswrapper[4941]: I0307 06:56:17.277429 4941 generic.go:334] "Generic (PLEG): container finished" podID="bdb71b40-ad9b-405b-a178-158109d65a92" containerID="43eab2b87765e736865e75c97c7f4a343f30404449579af0a35b9ccac4bbf877" exitCode=0 Mar 07 06:56:17 crc kubenswrapper[4941]: I0307 06:56:17.277510 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx5dk" event={"ID":"bdb71b40-ad9b-405b-a178-158109d65a92","Type":"ContainerDied","Data":"43eab2b87765e736865e75c97c7f4a343f30404449579af0a35b9ccac4bbf877"} Mar 07 06:56:17 crc kubenswrapper[4941]: I0307 06:56:17.280837 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktp7f" event={"ID":"d003569d-8946-47e7-adf2-5148ca8de944","Type":"ContainerStarted","Data":"131db25e24bc1e94bca46dfac0d77253c12d4fb72dfb909ae71564be9fafd63d"} Mar 07 06:56:18 crc kubenswrapper[4941]: I0307 06:56:18.290265 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" 
event={"ID":"75569bd2-5c18-4a4d-9545-210a324a4b7a","Type":"ContainerStarted","Data":"b72ea7fa8f146afe4eb6bdbfecb5c5aea96475063341d60752ee2efe30522b26"} Mar 07 06:56:18 crc kubenswrapper[4941]: I0307 06:56:18.293361 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" event={"ID":"3034b0d0-6866-4391-ac47-43bd5312adb1","Type":"ContainerStarted","Data":"f1ba742c00b0b30829ae47befd1eafabdd15de2ea8bed6826082e505b3aee2dd"} Mar 07 06:56:18 crc kubenswrapper[4941]: I0307 06:56:18.314603 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ktp7f" podStartSLOduration=4.170032853 podStartE2EDuration="1m0.314584341s" podCreationTimestamp="2026-03-07 06:55:18 +0000 UTC" firstStartedPulling="2026-03-07 06:55:20.533141008 +0000 UTC m=+217.485506473" lastFinishedPulling="2026-03-07 06:56:16.677692496 +0000 UTC m=+273.630057961" observedRunningTime="2026-03-07 06:56:18.313317916 +0000 UTC m=+275.265683401" watchObservedRunningTime="2026-03-07 06:56:18.314584341 +0000 UTC m=+275.266949816" Mar 07 06:56:18 crc kubenswrapper[4941]: I0307 06:56:18.842316 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:56:18 crc kubenswrapper[4941]: I0307 06:56:18.842557 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:56:20 crc kubenswrapper[4941]: I0307 06:56:20.616748 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ktp7f" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="registry-server" probeResult="failure" output=< Mar 07 06:56:20 crc kubenswrapper[4941]: timeout: failed to connect service ":50051" within 1s Mar 07 06:56:20 crc kubenswrapper[4941]: > Mar 07 06:56:25 crc kubenswrapper[4941]: I0307 06:56:23.867536 4941 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" podUID="612ac789-5007-4e17-a81a-cf753c2acadc" containerName="oauth-openshift" containerID="cri-o://c06b09c5153c27b50b6ce6200c22c54548df3ddd58ab14869b8f22d23710cb0b" gracePeriod=15 Mar 07 06:56:25 crc kubenswrapper[4941]: I0307 06:56:25.342143 4941 generic.go:334] "Generic (PLEG): container finished" podID="612ac789-5007-4e17-a81a-cf753c2acadc" containerID="c06b09c5153c27b50b6ce6200c22c54548df3ddd58ab14869b8f22d23710cb0b" exitCode=0 Mar 07 06:56:25 crc kubenswrapper[4941]: I0307 06:56:25.342190 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" event={"ID":"612ac789-5007-4e17-a81a-cf753c2acadc","Type":"ContainerDied","Data":"c06b09c5153c27b50b6ce6200c22c54548df3ddd58ab14869b8f22d23710cb0b"} Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.369234 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" event={"ID":"3034b0d0-6866-4391-ac47-43bd5312adb1","Type":"ContainerStarted","Data":"29c439e1e6aac1934093b4cb7638023a26d4a89f0d1676bb0f6234ec5a9b7f66"} Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.370193 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.376811 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" event={"ID":"75569bd2-5c18-4a4d-9545-210a324a4b7a","Type":"ContainerStarted","Data":"a083e23f156d4f42ac69007c05a3ea402a6501e69ad7110eaa2e8dc0531af0e6"} Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.377584 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.381580 4941 patch_prober.go:28] interesting pod/controller-manager-86495f659f-6xnjp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.381654 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" podUID="75569bd2-5c18-4a4d-9545-210a324a4b7a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.385674 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.398571 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" podStartSLOduration=13.398541118 podStartE2EDuration="13.398541118s" podCreationTimestamp="2026-03-07 06:56:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:27.39276975 +0000 UTC m=+284.345135225" watchObservedRunningTime="2026-03-07 06:56:27.398541118 +0000 UTC m=+284.350906583" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.464234 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" podStartSLOduration=13.46419844 podStartE2EDuration="13.46419844s" podCreationTimestamp="2026-03-07 06:56:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:27.420016294 +0000 UTC m=+284.372381759" watchObservedRunningTime="2026-03-07 06:56:27.46419844 +0000 UTC m=+284.416563905" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.514105 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-service-ca\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.514887 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-session\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.514918 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r28vr\" (UniqueName: \"kubernetes.io/projected/612ac789-5007-4e17-a81a-cf753c2acadc-kube-api-access-r28vr\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.514977 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-idp-0-file-data\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515024 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-provider-selection\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515063 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-cliconfig\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515106 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-serving-cert\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515138 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-ocp-branding-template\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515177 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-audit-policies\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515203 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/612ac789-5007-4e17-a81a-cf753c2acadc-audit-dir\") pod 
\"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515231 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-error\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515270 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-trusted-ca-bundle\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515330 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-router-certs\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.515359 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-login\") pod \"612ac789-5007-4e17-a81a-cf753c2acadc\" (UID: \"612ac789-5007-4e17-a81a-cf753c2acadc\") " Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.521612 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: 
"612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.524459 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/612ac789-5007-4e17-a81a-cf753c2acadc-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.525536 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.525584 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.527554 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.531329 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.533232 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.534049 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/612ac789-5007-4e17-a81a-cf753c2acadc-kube-api-access-r28vr" (OuterVolumeSpecName: "kube-api-access-r28vr") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "kube-api-access-r28vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.534676 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.535621 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.536116 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.539006 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.541089 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.541352 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "612ac789-5007-4e17-a81a-cf753c2acadc" (UID: "612ac789-5007-4e17-a81a-cf753c2acadc"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.616581 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617124 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617141 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617159 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617175 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r28vr\" (UniqueName: \"kubernetes.io/projected/612ac789-5007-4e17-a81a-cf753c2acadc-kube-api-access-r28vr\") on node \"crc\" 
DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617187 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617200 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617217 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617229 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617241 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617253 4941 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617265 4941 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/612ac789-5007-4e17-a81a-cf753c2acadc-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617277 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.617288 4941 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/612ac789-5007-4e17-a81a-cf753c2acadc-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:27 crc kubenswrapper[4941]: I0307 06:56:27.776346 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.403973 4941 generic.go:334] "Generic (PLEG): container finished" podID="805b56ac-66fd-4704-adb1-f3968f17f835" containerID="d9c269d4a8b1f56ee0402d644bcbe0748bf9b3ad684452bbd07a64c7dd3e9c57" exitCode=0 Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.404066 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4zqp" event={"ID":"805b56ac-66fd-4704-adb1-f3968f17f835","Type":"ContainerDied","Data":"d9c269d4a8b1f56ee0402d644bcbe0748bf9b3ad684452bbd07a64c7dd3e9c57"} Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.408815 4941 generic.go:334] "Generic (PLEG): container finished" podID="a10e3708-a476-4698-aa8d-ba99a795524a" containerID="021eb75d33ce6a9734e651ac4fb1cca81f1441e309586915904911dc8dbb5b37" exitCode=0 Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.408869 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v5x2" 
event={"ID":"a10e3708-a476-4698-aa8d-ba99a795524a","Type":"ContainerDied","Data":"021eb75d33ce6a9734e651ac4fb1cca81f1441e309586915904911dc8dbb5b37"} Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.415509 4941 generic.go:334] "Generic (PLEG): container finished" podID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerID="c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa" exitCode=0 Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.415598 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsszj" event={"ID":"715e8d60-13c8-442f-bec0-2f2fd1cfe172","Type":"ContainerDied","Data":"c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa"} Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.425978 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g4dmh" event={"ID":"36212ca9-755e-4104-a203-7c136afbfca9","Type":"ContainerStarted","Data":"172a5846ac2238be5a04b1a4c39628b678676e1d9cab2409d32d6c0195b72ba2"} Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.431838 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" event={"ID":"612ac789-5007-4e17-a81a-cf753c2acadc","Type":"ContainerDied","Data":"f02bd535e53f3cac720e0f6f703bc17d02b580ca432a546c37b4c30530e2125a"} Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.431890 4941 scope.go:117] "RemoveContainer" containerID="c06b09c5153c27b50b6ce6200c22c54548df3ddd58ab14869b8f22d23710cb0b" Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.432019 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8n7vr" Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.436666 4941 generic.go:334] "Generic (PLEG): container finished" podID="86719fee-4b62-4f53-958e-9e87f56a9062" containerID="cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d" exitCode=0 Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.436754 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jg549" event={"ID":"86719fee-4b62-4f53-958e-9e87f56a9062","Type":"ContainerDied","Data":"cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d"} Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.447853 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx5dk" event={"ID":"bdb71b40-ad9b-405b-a178-158109d65a92","Type":"ContainerStarted","Data":"e004e0f089b7ba7eb1801ec7f91f0a05195695ac76747a0d6e8e7433e09dcb2b"} Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.456308 4941 generic.go:334] "Generic (PLEG): container finished" podID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerID="9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34" exitCode=0 Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.456472 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxk7" event={"ID":"37622fc0-c5dc-4e0d-848a-214bce293f7f","Type":"ContainerDied","Data":"9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34"} Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.462267 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.513810 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8n7vr"] Mar 07 06:56:28 crc kubenswrapper[4941]: 
I0307 06:56:28.521105 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8n7vr"] Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.594241 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qx5dk" podStartSLOduration=4.847991405 podStartE2EDuration="1m13.594220141s" podCreationTimestamp="2026-03-07 06:55:15 +0000 UTC" firstStartedPulling="2026-03-07 06:55:18.395641507 +0000 UTC m=+215.348006982" lastFinishedPulling="2026-03-07 06:56:27.141870253 +0000 UTC m=+284.094235718" observedRunningTime="2026-03-07 06:56:28.592215156 +0000 UTC m=+285.544580621" watchObservedRunningTime="2026-03-07 06:56:28.594220141 +0000 UTC m=+285.546585596" Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.904149 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:56:28 crc kubenswrapper[4941]: I0307 06:56:28.951872 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.488264 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jg549" event={"ID":"86719fee-4b62-4f53-958e-9e87f56a9062","Type":"ContainerStarted","Data":"0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2"} Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.503743 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxk7" event={"ID":"37622fc0-c5dc-4e0d-848a-214bce293f7f","Type":"ContainerStarted","Data":"3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171"} Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.510077 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4zqp" 
event={"ID":"805b56ac-66fd-4704-adb1-f3968f17f835","Type":"ContainerStarted","Data":"7bf7f4695cd43c0565ac5f6b74ad1e3889ca846037a49295da8a36a319c304e1"} Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.518238 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v5x2" event={"ID":"a10e3708-a476-4698-aa8d-ba99a795524a","Type":"ContainerStarted","Data":"61051f5c8b80c92ee631a503d514c20a48781547bdf5fd70e1ba8264116986fa"} Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.519674 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jg549" podStartSLOduration=4.02329355 podStartE2EDuration="1m14.519647587s" podCreationTimestamp="2026-03-07 06:55:15 +0000 UTC" firstStartedPulling="2026-03-07 06:55:18.36702992 +0000 UTC m=+215.319395385" lastFinishedPulling="2026-03-07 06:56:28.863383957 +0000 UTC m=+285.815749422" observedRunningTime="2026-03-07 06:56:29.519634987 +0000 UTC m=+286.472000452" watchObservedRunningTime="2026-03-07 06:56:29.519647587 +0000 UTC m=+286.472013052" Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.525570 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsszj" event={"ID":"715e8d60-13c8-442f-bec0-2f2fd1cfe172","Type":"ContainerStarted","Data":"ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7"} Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.535002 4941 generic.go:334] "Generic (PLEG): container finished" podID="36212ca9-755e-4104-a203-7c136afbfca9" containerID="172a5846ac2238be5a04b1a4c39628b678676e1d9cab2409d32d6c0195b72ba2" exitCode=0 Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.535164 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g4dmh" 
event={"ID":"36212ca9-755e-4104-a203-7c136afbfca9","Type":"ContainerDied","Data":"172a5846ac2238be5a04b1a4c39628b678676e1d9cab2409d32d6c0195b72ba2"} Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.535204 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g4dmh" event={"ID":"36212ca9-755e-4104-a203-7c136afbfca9","Type":"ContainerStarted","Data":"c336ec75fb6e373e93af422f7d66b047b8f9175277903903ad6a67be085eb0fe"} Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.563894 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5v5x2" podStartSLOduration=3.039456454 podStartE2EDuration="1m12.563867364s" podCreationTimestamp="2026-03-07 06:55:17 +0000 UTC" firstStartedPulling="2026-03-07 06:55:19.513625603 +0000 UTC m=+216.465991068" lastFinishedPulling="2026-03-07 06:56:29.038036523 +0000 UTC m=+285.990401978" observedRunningTime="2026-03-07 06:56:29.561049217 +0000 UTC m=+286.513414682" watchObservedRunningTime="2026-03-07 06:56:29.563867364 +0000 UTC m=+286.516232829" Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.593093 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z4zqp" podStartSLOduration=4.092551955 podStartE2EDuration="1m14.593075241s" podCreationTimestamp="2026-03-07 06:55:15 +0000 UTC" firstStartedPulling="2026-03-07 06:55:18.427593037 +0000 UTC m=+215.379958502" lastFinishedPulling="2026-03-07 06:56:28.928116333 +0000 UTC m=+285.880481788" observedRunningTime="2026-03-07 06:56:29.589290418 +0000 UTC m=+286.541655883" watchObservedRunningTime="2026-03-07 06:56:29.593075241 +0000 UTC m=+286.545440706" Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.616355 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vtxk7" podStartSLOduration=4.27067721 podStartE2EDuration="1m12.616258214s" 
podCreationTimestamp="2026-03-07 06:55:17 +0000 UTC" firstStartedPulling="2026-03-07 06:55:20.5924435 +0000 UTC m=+217.544808965" lastFinishedPulling="2026-03-07 06:56:28.938024504 +0000 UTC m=+285.890389969" observedRunningTime="2026-03-07 06:56:29.614691141 +0000 UTC m=+286.567056606" watchObservedRunningTime="2026-03-07 06:56:29.616258214 +0000 UTC m=+286.568623679" Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.637966 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g4dmh" podStartSLOduration=3.137367084 podStartE2EDuration="1m11.637940596s" podCreationTimestamp="2026-03-07 06:55:18 +0000 UTC" firstStartedPulling="2026-03-07 06:55:20.578131111 +0000 UTC m=+217.530496576" lastFinishedPulling="2026-03-07 06:56:29.078704623 +0000 UTC m=+286.031070088" observedRunningTime="2026-03-07 06:56:29.63517088 +0000 UTC m=+286.587536345" watchObservedRunningTime="2026-03-07 06:56:29.637940596 +0000 UTC m=+286.590306061" Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.654730 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xsszj" podStartSLOduration=4.127941169 podStartE2EDuration="1m14.654703103s" podCreationTimestamp="2026-03-07 06:55:15 +0000 UTC" firstStartedPulling="2026-03-07 06:55:18.299456638 +0000 UTC m=+215.251822103" lastFinishedPulling="2026-03-07 06:56:28.826218572 +0000 UTC m=+285.778584037" observedRunningTime="2026-03-07 06:56:29.652868733 +0000 UTC m=+286.605234208" watchObservedRunningTime="2026-03-07 06:56:29.654703103 +0000 UTC m=+286.607068578" Mar 07 06:56:29 crc kubenswrapper[4941]: I0307 06:56:29.961122 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="612ac789-5007-4e17-a81a-cf753c2acadc" path="/var/lib/kubelet/pods/612ac789-5007-4e17-a81a-cf753c2acadc/volumes" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.145036 4941 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx"] Mar 07 06:56:31 crc kubenswrapper[4941]: E0307 06:56:31.145633 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612ac789-5007-4e17-a81a-cf753c2acadc" containerName="oauth-openshift" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.145648 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="612ac789-5007-4e17-a81a-cf753c2acadc" containerName="oauth-openshift" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.145779 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="612ac789-5007-4e17-a81a-cf753c2acadc" containerName="oauth-openshift" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.146264 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.153176 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.153790 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.154670 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.154836 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.154981 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.155111 4941 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.162821 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.162870 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.165057 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.165139 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.165141 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.165837 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.172818 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.173148 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx"] Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.182633 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.191813 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 06:56:31 crc 
kubenswrapper[4941]: I0307 06:56:31.336133 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336206 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-session\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336244 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-cliconfig\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336319 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-audit-policies\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336345 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb6xw\" 
(UniqueName: \"kubernetes.io/projected/5a0eab0a-d3e5-4552-9897-d5b25e04a667-kube-api-access-hb6xw\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336368 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336392 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-router-certs\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336429 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-login\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336455 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336475 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-error\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336497 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336644 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-serving-cert\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.336718 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a0eab0a-d3e5-4552-9897-d5b25e04a667-audit-dir\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc 
kubenswrapper[4941]: I0307 06:56:31.337083 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.437916 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-service-ca\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.437980 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-error\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.438014 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.438091 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-serving-cert\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.438816 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-service-ca\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439360 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a0eab0a-d3e5-4552-9897-d5b25e04a667-audit-dir\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439450 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439493 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " 
pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439527 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-session\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439554 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-cliconfig\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439581 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-audit-policies\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439613 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb6xw\" (UniqueName: \"kubernetes.io/projected/5a0eab0a-d3e5-4552-9897-d5b25e04a667-kube-api-access-hb6xw\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439644 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439676 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-login\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.439703 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-router-certs\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.441007 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-cliconfig\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.441064 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a0eab0a-d3e5-4552-9897-d5b25e04a667-audit-dir\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc 
kubenswrapper[4941]: I0307 06:56:31.444671 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.445522 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a0eab0a-d3e5-4552-9897-d5b25e04a667-audit-policies\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.447790 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-router-certs\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.447911 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-serving-cert\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.448081 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.448256 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.449390 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.461185 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-system-session\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.465332 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-error\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.469140 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb6xw\" (UniqueName: \"kubernetes.io/projected/5a0eab0a-d3e5-4552-9897-d5b25e04a667-kube-api-access-hb6xw\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.469964 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5a0eab0a-d3e5-4552-9897-d5b25e04a667-v4-0-config-user-template-login\") pod \"oauth-openshift-64c9cdcbb9-qs4cx\" (UID: \"5a0eab0a-d3e5-4552-9897-d5b25e04a667\") " pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.476045 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:31 crc kubenswrapper[4941]: I0307 06:56:31.945485 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx"] Mar 07 06:56:32 crc kubenswrapper[4941]: I0307 06:56:32.564708 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" event={"ID":"5a0eab0a-d3e5-4552-9897-d5b25e04a667","Type":"ContainerStarted","Data":"03670ff975b69c20e75d2ccccbc695e98de5d1cc5e65550fc5a7a12a370f785d"} Mar 07 06:56:32 crc kubenswrapper[4941]: I0307 06:56:32.565118 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:32 crc kubenswrapper[4941]: I0307 06:56:32.565131 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" 
event={"ID":"5a0eab0a-d3e5-4552-9897-d5b25e04a667","Type":"ContainerStarted","Data":"969ada5cefaac69d4f02bf47afd85d228eb4d75466432035a66c887c2ed9563a"} Mar 07 06:56:32 crc kubenswrapper[4941]: I0307 06:56:32.589627 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" podStartSLOduration=34.589607712 podStartE2EDuration="34.589607712s" podCreationTimestamp="2026-03-07 06:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:32.586856687 +0000 UTC m=+289.539222152" watchObservedRunningTime="2026-03-07 06:56:32.589607712 +0000 UTC m=+289.541973177" Mar 07 06:56:33 crc kubenswrapper[4941]: I0307 06:56:33.014817 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-64c9cdcbb9-qs4cx" Mar 07 06:56:34 crc kubenswrapper[4941]: I0307 06:56:34.650190 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86495f659f-6xnjp"] Mar 07 06:56:34 crc kubenswrapper[4941]: I0307 06:56:34.650424 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" podUID="75569bd2-5c18-4a4d-9545-210a324a4b7a" containerName="controller-manager" containerID="cri-o://a083e23f156d4f42ac69007c05a3ea402a6501e69ad7110eaa2e8dc0531af0e6" gracePeriod=30 Mar 07 06:56:34 crc kubenswrapper[4941]: I0307 06:56:34.773239 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277"] Mar 07 06:56:34 crc kubenswrapper[4941]: I0307 06:56:34.773609 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" podUID="3034b0d0-6866-4391-ac47-43bd5312adb1" 
containerName="route-controller-manager" containerID="cri-o://29c439e1e6aac1934093b4cb7638023a26d4a89f0d1676bb0f6234ec5a9b7f66" gracePeriod=30 Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.588769 4941 generic.go:334] "Generic (PLEG): container finished" podID="3034b0d0-6866-4391-ac47-43bd5312adb1" containerID="29c439e1e6aac1934093b4cb7638023a26d4a89f0d1676bb0f6234ec5a9b7f66" exitCode=0 Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.588877 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" event={"ID":"3034b0d0-6866-4391-ac47-43bd5312adb1","Type":"ContainerDied","Data":"29c439e1e6aac1934093b4cb7638023a26d4a89f0d1676bb0f6234ec5a9b7f66"} Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.591649 4941 generic.go:334] "Generic (PLEG): container finished" podID="75569bd2-5c18-4a4d-9545-210a324a4b7a" containerID="a083e23f156d4f42ac69007c05a3ea402a6501e69ad7110eaa2e8dc0531af0e6" exitCode=0 Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.591697 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" event={"ID":"75569bd2-5c18-4a4d-9545-210a324a4b7a","Type":"ContainerDied","Data":"a083e23f156d4f42ac69007c05a3ea402a6501e69ad7110eaa2e8dc0531af0e6"} Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.603585 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.603653 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.682451 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.796813 4941 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.796878 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.841488 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.843216 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.893008 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm"] Mar 07 06:56:35 crc kubenswrapper[4941]: E0307 06:56:35.893736 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3034b0d0-6866-4391-ac47-43bd5312adb1" containerName="route-controller-manager" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.893833 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3034b0d0-6866-4391-ac47-43bd5312adb1" containerName="route-controller-manager" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.894093 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="3034b0d0-6866-4391-ac47-43bd5312adb1" containerName="route-controller-manager" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.894712 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.906015 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm"] Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.911046 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3034b0d0-6866-4391-ac47-43bd5312adb1-serving-cert\") pod \"3034b0d0-6866-4391-ac47-43bd5312adb1\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.911348 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4rnh\" (UniqueName: \"kubernetes.io/projected/3034b0d0-6866-4391-ac47-43bd5312adb1-kube-api-access-k4rnh\") pod \"3034b0d0-6866-4391-ac47-43bd5312adb1\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.911490 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-client-ca\") pod \"3034b0d0-6866-4391-ac47-43bd5312adb1\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.911613 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-config\") pod \"3034b0d0-6866-4391-ac47-43bd5312adb1\" (UID: \"3034b0d0-6866-4391-ac47-43bd5312adb1\") " Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.912567 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-config" (OuterVolumeSpecName: "config") pod "3034b0d0-6866-4391-ac47-43bd5312adb1" (UID: 
"3034b0d0-6866-4391-ac47-43bd5312adb1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.912748 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-client-ca" (OuterVolumeSpecName: "client-ca") pod "3034b0d0-6866-4391-ac47-43bd5312adb1" (UID: "3034b0d0-6866-4391-ac47-43bd5312adb1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.913060 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bd82c99-6dc1-4fde-840f-fe13e78796ec-config\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.913196 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bd82c99-6dc1-4fde-840f-fe13e78796ec-client-ca\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.913338 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bd82c99-6dc1-4fde-840f-fe13e78796ec-serving-cert\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.913452 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g74xv\" (UniqueName: \"kubernetes.io/projected/5bd82c99-6dc1-4fde-840f-fe13e78796ec-kube-api-access-g74xv\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.913556 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.913629 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3034b0d0-6866-4391-ac47-43bd5312adb1-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.919726 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3034b0d0-6866-4391-ac47-43bd5312adb1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3034b0d0-6866-4391-ac47-43bd5312adb1" (UID: "3034b0d0-6866-4391-ac47-43bd5312adb1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:35 crc kubenswrapper[4941]: I0307 06:56:35.927646 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3034b0d0-6866-4391-ac47-43bd5312adb1-kube-api-access-k4rnh" (OuterVolumeSpecName: "kube-api-access-k4rnh") pod "3034b0d0-6866-4391-ac47-43bd5312adb1" (UID: "3034b0d0-6866-4391-ac47-43bd5312adb1"). InnerVolumeSpecName "kube-api-access-k4rnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.015388 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bd82c99-6dc1-4fde-840f-fe13e78796ec-serving-cert\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.015514 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g74xv\" (UniqueName: \"kubernetes.io/projected/5bd82c99-6dc1-4fde-840f-fe13e78796ec-kube-api-access-g74xv\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.015584 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bd82c99-6dc1-4fde-840f-fe13e78796ec-config\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.015686 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bd82c99-6dc1-4fde-840f-fe13e78796ec-client-ca\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.015733 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4rnh\" (UniqueName: 
\"kubernetes.io/projected/3034b0d0-6866-4391-ac47-43bd5312adb1-kube-api-access-k4rnh\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.015747 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3034b0d0-6866-4391-ac47-43bd5312adb1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.017208 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bd82c99-6dc1-4fde-840f-fe13e78796ec-client-ca\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.017420 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bd82c99-6dc1-4fde-840f-fe13e78796ec-config\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.023623 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bd82c99-6dc1-4fde-840f-fe13e78796ec-serving-cert\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") " pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.035182 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74xv\" (UniqueName: \"kubernetes.io/projected/5bd82c99-6dc1-4fde-840f-fe13e78796ec-kube-api-access-g74xv\") pod \"route-controller-manager-fdc7bfbf4-hf9pm\" (UID: \"5bd82c99-6dc1-4fde-840f-fe13e78796ec\") 
" pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.085870 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.086056 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.132871 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.212185 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.325349 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.325538 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.344533 4941 patch_prober.go:28] interesting pod/controller-manager-86495f659f-6xnjp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.344601 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" podUID="75569bd2-5c18-4a4d-9545-210a324a4b7a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: 
connection refused" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.460708 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.600516 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.601035 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277" event={"ID":"3034b0d0-6866-4391-ac47-43bd5312adb1","Type":"ContainerDied","Data":"f1ba742c00b0b30829ae47befd1eafabdd15de2ea8bed6826082e505b3aee2dd"} Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.601078 4941 scope.go:117] "RemoveContainer" containerID="29c439e1e6aac1934093b4cb7638023a26d4a89f0d1676bb0f6234ec5a9b7f66" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.639820 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277"] Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.651841 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57778f6ddf-cx277"] Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.657922 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.663833 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.674641 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 
06:56:36.693052 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.739961 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm"] Mar 07 06:56:36 crc kubenswrapper[4941]: W0307 06:56:36.760249 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bd82c99_6dc1_4fde_840f_fe13e78796ec.slice/crio-3fc4dd72ae140651e366883640519ae34a8fb23754ef9af241403c71dbd686ad WatchSource:0}: Error finding container 3fc4dd72ae140651e366883640519ae34a8fb23754ef9af241403c71dbd686ad: Status 404 returned error can't find the container with id 3fc4dd72ae140651e366883640519ae34a8fb23754ef9af241403c71dbd686ad Mar 07 06:56:36 crc kubenswrapper[4941]: I0307 06:56:36.923387 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.040379 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-proxy-ca-bundles\") pod \"75569bd2-5c18-4a4d-9545-210a324a4b7a\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.040458 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-config\") pod \"75569bd2-5c18-4a4d-9545-210a324a4b7a\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.040527 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75569bd2-5c18-4a4d-9545-210a324a4b7a-serving-cert\") pod \"75569bd2-5c18-4a4d-9545-210a324a4b7a\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.040560 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w9zx\" (UniqueName: \"kubernetes.io/projected/75569bd2-5c18-4a4d-9545-210a324a4b7a-kube-api-access-6w9zx\") pod \"75569bd2-5c18-4a4d-9545-210a324a4b7a\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.040610 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-client-ca\") pod \"75569bd2-5c18-4a4d-9545-210a324a4b7a\" (UID: \"75569bd2-5c18-4a4d-9545-210a324a4b7a\") " Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.041731 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-client-ca" (OuterVolumeSpecName: "client-ca") pod "75569bd2-5c18-4a4d-9545-210a324a4b7a" (UID: "75569bd2-5c18-4a4d-9545-210a324a4b7a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.043667 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "75569bd2-5c18-4a4d-9545-210a324a4b7a" (UID: "75569bd2-5c18-4a4d-9545-210a324a4b7a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.043681 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-config" (OuterVolumeSpecName: "config") pod "75569bd2-5c18-4a4d-9545-210a324a4b7a" (UID: "75569bd2-5c18-4a4d-9545-210a324a4b7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.049425 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75569bd2-5c18-4a4d-9545-210a324a4b7a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "75569bd2-5c18-4a4d-9545-210a324a4b7a" (UID: "75569bd2-5c18-4a4d-9545-210a324a4b7a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.049480 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75569bd2-5c18-4a4d-9545-210a324a4b7a-kube-api-access-6w9zx" (OuterVolumeSpecName: "kube-api-access-6w9zx") pod "75569bd2-5c18-4a4d-9545-210a324a4b7a" (UID: "75569bd2-5c18-4a4d-9545-210a324a4b7a"). InnerVolumeSpecName "kube-api-access-6w9zx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.143887 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w9zx\" (UniqueName: \"kubernetes.io/projected/75569bd2-5c18-4a4d-9545-210a324a4b7a-kube-api-access-6w9zx\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.143950 4941 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.143966 4941 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.143977 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75569bd2-5c18-4a4d-9545-210a324a4b7a-config\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.143987 4941 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75569bd2-5c18-4a4d-9545-210a324a4b7a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.608164 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" event={"ID":"5bd82c99-6dc1-4fde-840f-fe13e78796ec","Type":"ContainerStarted","Data":"e0e4e4ba71be908a14004accbc9696a1d7116765873a2f48d87d519f7d6a3474"} Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.608818 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" 
event={"ID":"5bd82c99-6dc1-4fde-840f-fe13e78796ec","Type":"ContainerStarted","Data":"3fc4dd72ae140651e366883640519ae34a8fb23754ef9af241403c71dbd686ad"} Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.608865 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.610446 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" event={"ID":"75569bd2-5c18-4a4d-9545-210a324a4b7a","Type":"ContainerDied","Data":"b72ea7fa8f146afe4eb6bdbfecb5c5aea96475063341d60752ee2efe30522b26"} Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.610486 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86495f659f-6xnjp" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.610501 4941 scope.go:117] "RemoveContainer" containerID="a083e23f156d4f42ac69007c05a3ea402a6501e69ad7110eaa2e8dc0531af0e6" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.631768 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" podStartSLOduration=3.63174348 podStartE2EDuration="3.63174348s" podCreationTimestamp="2026-03-07 06:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:37.63098146 +0000 UTC m=+294.583346925" watchObservedRunningTime="2026-03-07 06:56:37.63174348 +0000 UTC m=+294.584108945" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.647637 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fdc7bfbf4-hf9pm" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.655145 4941 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86495f659f-6xnjp"] Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.661140 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86495f659f-6xnjp"] Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.781317 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.781357 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.821910 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.963987 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3034b0d0-6866-4391-ac47-43bd5312adb1" path="/var/lib/kubelet/pods/3034b0d0-6866-4391-ac47-43bd5312adb1/volumes" Mar 07 06:56:37 crc kubenswrapper[4941]: I0307 06:56:37.964524 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75569bd2-5c18-4a4d-9545-210a324a4b7a" path="/var/lib/kubelet/pods/75569bd2-5c18-4a4d-9545-210a324a4b7a/volumes" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.015342 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b7879745c-2vlkm"] Mar 07 06:56:38 crc kubenswrapper[4941]: E0307 06:56:38.015622 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75569bd2-5c18-4a4d-9545-210a324a4b7a" containerName="controller-manager" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.015636 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="75569bd2-5c18-4a4d-9545-210a324a4b7a" containerName="controller-manager" Mar 07 06:56:38 crc 
kubenswrapper[4941]: I0307 06:56:38.015745 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="75569bd2-5c18-4a4d-9545-210a324a4b7a" containerName="controller-manager" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.016148 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.018466 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.019041 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.019066 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.019344 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.019654 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.019800 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.025229 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b7879745c-2vlkm"] Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.026745 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.163110 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7ln7\" (UniqueName: \"kubernetes.io/projected/2adb33d0-2028-4232-af39-3753e731ab76-kube-api-access-x7ln7\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.163168 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-proxy-ca-bundles\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.163216 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-config\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.163250 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-client-ca\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.163279 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2adb33d0-2028-4232-af39-3753e731ab76-serving-cert\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: 
\"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.182113 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.182195 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.238611 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.264713 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2adb33d0-2028-4232-af39-3753e731ab76-serving-cert\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.264823 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7ln7\" (UniqueName: \"kubernetes.io/projected/2adb33d0-2028-4232-af39-3753e731ab76-kube-api-access-x7ln7\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.264854 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-proxy-ca-bundles\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: 
I0307 06:56:38.264877 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-config\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.264907 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-client-ca\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.266108 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-client-ca\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.266994 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-proxy-ca-bundles\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.267203 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2adb33d0-2028-4232-af39-3753e731ab76-config\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" 
Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.273038 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2adb33d0-2028-4232-af39-3753e731ab76-serving-cert\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.286447 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7ln7\" (UniqueName: \"kubernetes.io/projected/2adb33d0-2028-4232-af39-3753e731ab76-kube-api-access-x7ln7\") pod \"controller-manager-7b7879745c-2vlkm\" (UID: \"2adb33d0-2028-4232-af39-3753e731ab76\") " pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.339019 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.616671 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b7879745c-2vlkm"] Mar 07 06:56:38 crc kubenswrapper[4941]: W0307 06:56:38.621988 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2adb33d0_2028_4232_af39_3753e731ab76.slice/crio-777a5542d37529017f3357434c06baa871a3bf535b26a943ddc33935ccf573da WatchSource:0}: Error finding container 777a5542d37529017f3357434c06baa871a3bf535b26a943ddc33935ccf573da: Status 404 returned error can't find the container with id 777a5542d37529017f3357434c06baa871a3bf535b26a943ddc33935ccf573da Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.669096 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:56:38 crc 
kubenswrapper[4941]: I0307 06:56:38.694393 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:56:38 crc kubenswrapper[4941]: I0307 06:56:38.841872 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jg549"] Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.184890 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.184958 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.220258 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.628422 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" event={"ID":"2adb33d0-2028-4232-af39-3753e731ab76","Type":"ContainerStarted","Data":"d34a3f6e61b98ffcc1ffe66c01ffa2add6b4386057ac119a18d81cd6af07ea90"} Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.629745 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.629768 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" event={"ID":"2adb33d0-2028-4232-af39-3753e731ab76","Type":"ContainerStarted","Data":"777a5542d37529017f3357434c06baa871a3bf535b26a943ddc33935ccf573da"} Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.629782 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jg549" 
podUID="86719fee-4b62-4f53-958e-9e87f56a9062" containerName="registry-server" containerID="cri-o://0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2" gracePeriod=2 Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.630740 4941 patch_prober.go:28] interesting pod/controller-manager-7b7879745c-2vlkm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.630782 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" podUID="2adb33d0-2028-4232-af39-3753e731ab76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.686951 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:56:39 crc kubenswrapper[4941]: I0307 06:56:39.704931 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" podStartSLOduration=5.704908861 podStartE2EDuration="5.704908861s" podCreationTimestamp="2026-03-07 06:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:56:39.657043335 +0000 UTC m=+296.609408800" watchObservedRunningTime="2026-03-07 06:56:39.704908861 +0000 UTC m=+296.657274326" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.054864 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.194792 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-utilities\") pod \"86719fee-4b62-4f53-958e-9e87f56a9062\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.194869 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4pcp\" (UniqueName: \"kubernetes.io/projected/86719fee-4b62-4f53-958e-9e87f56a9062-kube-api-access-x4pcp\") pod \"86719fee-4b62-4f53-958e-9e87f56a9062\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.194911 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-catalog-content\") pod \"86719fee-4b62-4f53-958e-9e87f56a9062\" (UID: \"86719fee-4b62-4f53-958e-9e87f56a9062\") " Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.195628 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-utilities" (OuterVolumeSpecName: "utilities") pod "86719fee-4b62-4f53-958e-9e87f56a9062" (UID: "86719fee-4b62-4f53-958e-9e87f56a9062"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.203340 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86719fee-4b62-4f53-958e-9e87f56a9062-kube-api-access-x4pcp" (OuterVolumeSpecName: "kube-api-access-x4pcp") pod "86719fee-4b62-4f53-958e-9e87f56a9062" (UID: "86719fee-4b62-4f53-958e-9e87f56a9062"). InnerVolumeSpecName "kube-api-access-x4pcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.256998 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86719fee-4b62-4f53-958e-9e87f56a9062" (UID: "86719fee-4b62-4f53-958e-9e87f56a9062"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.295986 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.296029 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4pcp\" (UniqueName: \"kubernetes.io/projected/86719fee-4b62-4f53-958e-9e87f56a9062-kube-api-access-x4pcp\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.296042 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86719fee-4b62-4f53-958e-9e87f56a9062-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.639963 4941 generic.go:334] "Generic (PLEG): container finished" podID="86719fee-4b62-4f53-958e-9e87f56a9062" containerID="0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2" exitCode=0 Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.640527 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jg549" event={"ID":"86719fee-4b62-4f53-958e-9e87f56a9062","Type":"ContainerDied","Data":"0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2"} Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.640604 4941 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-jg549" event={"ID":"86719fee-4b62-4f53-958e-9e87f56a9062","Type":"ContainerDied","Data":"ec4f39fc91b5ef4a17e019bb59dc79248af52881668f6be434bea4bb8b658d76"} Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.640639 4941 scope.go:117] "RemoveContainer" containerID="0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.640810 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jg549" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.641660 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsszj"] Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.642054 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xsszj" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerName="registry-server" containerID="cri-o://ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7" gracePeriod=2 Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.648723 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b7879745c-2vlkm" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.663043 4941 scope.go:117] "RemoveContainer" containerID="cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.694530 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jg549"] Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.696029 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jg549"] Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.705803 4941 scope.go:117] "RemoveContainer" 
containerID="6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.719471 4941 scope.go:117] "RemoveContainer" containerID="0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2" Mar 07 06:56:40 crc kubenswrapper[4941]: E0307 06:56:40.719883 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2\": container with ID starting with 0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2 not found: ID does not exist" containerID="0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.720026 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2"} err="failed to get container status \"0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2\": rpc error: code = NotFound desc = could not find container \"0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2\": container with ID starting with 0e91ec8e8163e781edeba040ac125e6490e63867c8325e10d9130472c7b739c2 not found: ID does not exist" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.720154 4941 scope.go:117] "RemoveContainer" containerID="cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d" Mar 07 06:56:40 crc kubenswrapper[4941]: E0307 06:56:40.720601 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d\": container with ID starting with cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d not found: ID does not exist" containerID="cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d" Mar 07 06:56:40 crc 
kubenswrapper[4941]: I0307 06:56:40.720629 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d"} err="failed to get container status \"cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d\": rpc error: code = NotFound desc = could not find container \"cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d\": container with ID starting with cf43840be0d0d7343c046ab925a4019b3c8dfe6aa43ef094d4a19d712fdada0d not found: ID does not exist" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.720646 4941 scope.go:117] "RemoveContainer" containerID="6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c" Mar 07 06:56:40 crc kubenswrapper[4941]: E0307 06:56:40.720910 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c\": container with ID starting with 6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c not found: ID does not exist" containerID="6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c" Mar 07 06:56:40 crc kubenswrapper[4941]: I0307 06:56:40.721005 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c"} err="failed to get container status \"6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c\": rpc error: code = NotFound desc = could not find container \"6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c\": container with ID starting with 6dc60f6de787d1ff1aa2dd430fa3f151dca6fb0653c81de6ddafb1bb943b2b5c not found: ID does not exist" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.243855 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxk7"] Mar 07 06:56:41 
crc kubenswrapper[4941]: I0307 06:56:41.246648 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vtxk7" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerName="registry-server" containerID="cri-o://3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171" gracePeriod=2 Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.271547 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.413661 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qhtv\" (UniqueName: \"kubernetes.io/projected/715e8d60-13c8-442f-bec0-2f2fd1cfe172-kube-api-access-8qhtv\") pod \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.413730 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-catalog-content\") pod \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.413861 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-utilities\") pod \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\" (UID: \"715e8d60-13c8-442f-bec0-2f2fd1cfe172\") " Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.415099 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-utilities" (OuterVolumeSpecName: "utilities") pod "715e8d60-13c8-442f-bec0-2f2fd1cfe172" (UID: "715e8d60-13c8-442f-bec0-2f2fd1cfe172"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.419554 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715e8d60-13c8-442f-bec0-2f2fd1cfe172-kube-api-access-8qhtv" (OuterVolumeSpecName: "kube-api-access-8qhtv") pod "715e8d60-13c8-442f-bec0-2f2fd1cfe172" (UID: "715e8d60-13c8-442f-bec0-2f2fd1cfe172"). InnerVolumeSpecName "kube-api-access-8qhtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.479933 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "715e8d60-13c8-442f-bec0-2f2fd1cfe172" (UID: "715e8d60-13c8-442f-bec0-2f2fd1cfe172"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.515558 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.515594 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715e8d60-13c8-442f-bec0-2f2fd1cfe172-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.515604 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qhtv\" (UniqueName: \"kubernetes.io/projected/715e8d60-13c8-442f-bec0-2f2fd1cfe172-kube-api-access-8qhtv\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.618750 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.655260 4941 generic.go:334] "Generic (PLEG): container finished" podID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerID="3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171" exitCode=0 Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.655308 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxk7" event={"ID":"37622fc0-c5dc-4e0d-848a-214bce293f7f","Type":"ContainerDied","Data":"3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171"} Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.655349 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxk7" event={"ID":"37622fc0-c5dc-4e0d-848a-214bce293f7f","Type":"ContainerDied","Data":"7d9a7372a18ca6df94bb77a20939c88d162938d62478d6229c493fa790ffbc7c"} Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.655367 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtxk7" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.655386 4941 scope.go:117] "RemoveContainer" containerID="3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.660362 4941 generic.go:334] "Generic (PLEG): container finished" podID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerID="ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7" exitCode=0 Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.660433 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsszj" event={"ID":"715e8d60-13c8-442f-bec0-2f2fd1cfe172","Type":"ContainerDied","Data":"ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7"} Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.660456 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xsszj" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.660497 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsszj" event={"ID":"715e8d60-13c8-442f-bec0-2f2fd1cfe172","Type":"ContainerDied","Data":"35bce658649352d6294ec01b46c93c6ca0f9142a68d0b106ef69d00733c03f14"} Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.674077 4941 scope.go:117] "RemoveContainer" containerID="9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.693744 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsszj"] Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.695697 4941 scope.go:117] "RemoveContainer" containerID="1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.701005 4941 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-xsszj"] Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.714336 4941 scope.go:117] "RemoveContainer" containerID="3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171" Mar 07 06:56:41 crc kubenswrapper[4941]: E0307 06:56:41.714659 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171\": container with ID starting with 3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171 not found: ID does not exist" containerID="3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.714700 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171"} err="failed to get container status \"3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171\": rpc error: code = NotFound desc = could not find container \"3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171\": container with ID starting with 3a34d313b7cc6c5ec835038db8ec4ce254948654c6dc501cf13d7b2cf6696171 not found: ID does not exist" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.714727 4941 scope.go:117] "RemoveContainer" containerID="9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34" Mar 07 06:56:41 crc kubenswrapper[4941]: E0307 06:56:41.715119 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34\": container with ID starting with 9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34 not found: ID does not exist" containerID="9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 
06:56:41.715147 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34"} err="failed to get container status \"9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34\": rpc error: code = NotFound desc = could not find container \"9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34\": container with ID starting with 9e8a66df85863050c4fb353c15a57d00c282e6a6a9c6309060302c32b3426a34 not found: ID does not exist" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.715163 4941 scope.go:117] "RemoveContainer" containerID="1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561" Mar 07 06:56:41 crc kubenswrapper[4941]: E0307 06:56:41.715390 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561\": container with ID starting with 1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561 not found: ID does not exist" containerID="1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.715436 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561"} err="failed to get container status \"1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561\": rpc error: code = NotFound desc = could not find container \"1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561\": container with ID starting with 1177b5251f7bbad1bb8f8bb263363d08461e953fe922334fd85ea409f4423561 not found: ID does not exist" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.715459 4941 scope.go:117] "RemoveContainer" containerID="ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7" Mar 07 06:56:41 crc 
kubenswrapper[4941]: I0307 06:56:41.717626 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qw9x\" (UniqueName: \"kubernetes.io/projected/37622fc0-c5dc-4e0d-848a-214bce293f7f-kube-api-access-7qw9x\") pod \"37622fc0-c5dc-4e0d-848a-214bce293f7f\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.717755 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-catalog-content\") pod \"37622fc0-c5dc-4e0d-848a-214bce293f7f\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.717842 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-utilities\") pod \"37622fc0-c5dc-4e0d-848a-214bce293f7f\" (UID: \"37622fc0-c5dc-4e0d-848a-214bce293f7f\") " Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.718695 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-utilities" (OuterVolumeSpecName: "utilities") pod "37622fc0-c5dc-4e0d-848a-214bce293f7f" (UID: "37622fc0-c5dc-4e0d-848a-214bce293f7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.721659 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37622fc0-c5dc-4e0d-848a-214bce293f7f-kube-api-access-7qw9x" (OuterVolumeSpecName: "kube-api-access-7qw9x") pod "37622fc0-c5dc-4e0d-848a-214bce293f7f" (UID: "37622fc0-c5dc-4e0d-848a-214bce293f7f"). InnerVolumeSpecName "kube-api-access-7qw9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.726566 4941 scope.go:117] "RemoveContainer" containerID="c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.740517 4941 scope.go:117] "RemoveContainer" containerID="83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.743234 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37622fc0-c5dc-4e0d-848a-214bce293f7f" (UID: "37622fc0-c5dc-4e0d-848a-214bce293f7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.751154 4941 scope.go:117] "RemoveContainer" containerID="ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7" Mar 07 06:56:41 crc kubenswrapper[4941]: E0307 06:56:41.751483 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7\": container with ID starting with ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7 not found: ID does not exist" containerID="ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.751527 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7"} err="failed to get container status \"ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7\": rpc error: code = NotFound desc = could not find container \"ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7\": container with ID starting 
with ad1fec307fe9dff157fe4ba6fc0bc60c95a15d31df170fb025518474f82dbef7 not found: ID does not exist" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.751553 4941 scope.go:117] "RemoveContainer" containerID="c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa" Mar 07 06:56:41 crc kubenswrapper[4941]: E0307 06:56:41.751769 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa\": container with ID starting with c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa not found: ID does not exist" containerID="c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.751798 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa"} err="failed to get container status \"c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa\": rpc error: code = NotFound desc = could not find container \"c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa\": container with ID starting with c3be598c099f0409cedbbad181bc9ccca85cdb3724d2a1570c21ff57316be0fa not found: ID does not exist" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.751814 4941 scope.go:117] "RemoveContainer" containerID="83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286" Mar 07 06:56:41 crc kubenswrapper[4941]: E0307 06:56:41.752018 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286\": container with ID starting with 83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286 not found: ID does not exist" containerID="83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286" Mar 07 06:56:41 
crc kubenswrapper[4941]: I0307 06:56:41.752039 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286"} err="failed to get container status \"83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286\": rpc error: code = NotFound desc = could not find container \"83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286\": container with ID starting with 83a6b64fddd3d37f9bff1a6b5308c209ad5ca7a7d2a5c7cc289127b581440286 not found: ID does not exist" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.820817 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.821727 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qw9x\" (UniqueName: \"kubernetes.io/projected/37622fc0-c5dc-4e0d-848a-214bce293f7f-kube-api-access-7qw9x\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.821773 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37622fc0-c5dc-4e0d-848a-214bce293f7f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.969703 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" path="/var/lib/kubelet/pods/715e8d60-13c8-442f-bec0-2f2fd1cfe172/volumes" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.971162 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" path="/var/lib/kubelet/pods/86719fee-4b62-4f53-958e-9e87f56a9062/volumes" Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.989058 4941 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-vtxk7"] Mar 07 06:56:41 crc kubenswrapper[4941]: I0307 06:56:41.994905 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxk7"] Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.257925 4941 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258336 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerName="extract-content" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258361 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerName="extract-content" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258381 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerName="extract-utilities" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258393 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerName="extract-utilities" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258444 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258458 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258485 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258497 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" 
containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258514 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" containerName="extract-utilities" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258529 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" containerName="extract-utilities" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258545 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258557 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258576 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerName="extract-utilities" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258587 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerName="extract-utilities" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258612 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerName="extract-content" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258624 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerName="extract-content" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.258640 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" containerName="extract-content" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258652 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" 
containerName="extract-content" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258871 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="715e8d60-13c8-442f-bec0-2f2fd1cfe172" containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258908 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.258929 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="86719fee-4b62-4f53-958e-9e87f56a9062" containerName="registry-server" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.259594 4941 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.259739 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.260232 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76" gracePeriod=15 Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.260299 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a" gracePeriod=15 Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.260382 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96" gracePeriod=15 Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.260446 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148" gracePeriod=15 Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.260520 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f" gracePeriod=15 Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261191 4941 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.261548 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261570 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.261585 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261611 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.261628 4941 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261641 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.261658 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261670 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.261685 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261698 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.261721 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261733 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.261749 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261763 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 07 06:56:42 crc 
kubenswrapper[4941]: E0307 06:56:42.261779 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261791 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261974 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.261992 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262005 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262022 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262037 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262054 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262072 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.262233 4941 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262248 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262447 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.262627 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262641 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.262810 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.308151 4941 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.429993 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.430188 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.430426 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.430489 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.430537 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.430561 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc 
kubenswrapper[4941]: I0307 06:56:42.430589 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.430664 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.521582 4941 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.521646 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.531767 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.531874 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.531931 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.531971 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.531994 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.531998 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532067 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532091 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532076 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532069 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532169 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532195 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532262 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532319 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532382 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.532324 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.912801 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.929198 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.930822 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 06:56:42 crc kubenswrapper[4941]: I0307 06:56:42.931785 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148" exitCode=2 Mar 07 06:56:42 crc kubenswrapper[4941]: E0307 06:56:42.934176 4941 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189a7cc84603cb8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:56:42.93366875 +0000 UTC m=+299.886034215,LastTimestamp:2026-03-07 06:56:42.93366875 +0000 UTC m=+299.886034215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.721478 4941 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.721873 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.941079 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b"} Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.941157 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"30ba40ce306fd811319beabb7ea534eb21289871ae64b3e6e9690d4ca0261b6c"} Mar 07 06:56:43 crc kubenswrapper[4941]: E0307 06:56:43.942037 4941 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.942050 4941 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.943905 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.945471 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.946716 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a" exitCode=0 Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.946744 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f" exitCode=0 Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.946756 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96" exitCode=0 Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.946798 4941 scope.go:117] "RemoveContainer" containerID="e63967162959653167b155830e5910be543c2cd419b26ea8e076b8f8d6b9b4bd" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.949685 4941 generic.go:334] "Generic (PLEG): container finished" podID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" containerID="b380491549d4656d158f042ce2c960e96d68a628025a0caceea954b15a9a0b93" exitCode=0 Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.949713 
4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b5abd453-0ae9-420c-92b5-84b76e1b4a6a","Type":"ContainerDied","Data":"b380491549d4656d158f042ce2c960e96d68a628025a0caceea954b15a9a0b93"} Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.950282 4941 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.950574 4941 status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.959568 4941 status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.959842 4941 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:43 crc kubenswrapper[4941]: I0307 06:56:43.986032 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37622fc0-c5dc-4e0d-848a-214bce293f7f" 
path="/var/lib/kubelet/pods/37622fc0-c5dc-4e0d-848a-214bce293f7f/volumes" Mar 07 06:56:44 crc kubenswrapper[4941]: I0307 06:56:44.960602 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 06:56:44 crc kubenswrapper[4941]: I0307 06:56:44.962008 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76" exitCode=0 Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.180452 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.182861 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.183627 4941 status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.183973 4941 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.337400 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.338070 4941 status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.338415 4941 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.356345 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.356415 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.356477 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.356759 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.356790 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.356809 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.458645 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kube-api-access\") pod \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.458767 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-var-lock\") pod \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.458804 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kubelet-dir\") pod \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\" (UID: \"b5abd453-0ae9-420c-92b5-84b76e1b4a6a\") " Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.459081 4941 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.459105 4941 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.459117 4941 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.459161 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b5abd453-0ae9-420c-92b5-84b76e1b4a6a" (UID: "b5abd453-0ae9-420c-92b5-84b76e1b4a6a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.459312 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-var-lock" (OuterVolumeSpecName: "var-lock") pod "b5abd453-0ae9-420c-92b5-84b76e1b4a6a" (UID: "b5abd453-0ae9-420c-92b5-84b76e1b4a6a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.465590 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b5abd453-0ae9-420c-92b5-84b76e1b4a6a" (UID: "b5abd453-0ae9-420c-92b5-84b76e1b4a6a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.561824 4941 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.561874 4941 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.561894 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5abd453-0ae9-420c-92b5-84b76e1b4a6a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.962848 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.969207 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b5abd453-0ae9-420c-92b5-84b76e1b4a6a","Type":"ContainerDied","Data":"1e11e91ea2085cccd761445af3c67ae0a50b428b64b88bdf9674f2fa79b166e2"} Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.969261 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e11e91ea2085cccd761445af3c67ae0a50b428b64b88bdf9674f2fa79b166e2" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.969325 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.974186 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.974919 4941 scope.go:117] "RemoveContainer" containerID="87d8837aebf3b632f1bf3ac3d391461126d28da285eef2e6927d65d720b11f9a" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.975002 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.976377 4941 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.976627 4941 status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.981737 4941 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.982172 4941 status_manager.go:851] "Failed to get status for pod" 
podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.992135 4941 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.992377 4941 status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:45 crc kubenswrapper[4941]: I0307 06:56:45.994614 4941 scope.go:117] "RemoveContainer" containerID="3f50f6e0ff3fdabf433f5250331c152dc86dd26dc69007b5b27ccb2f1123184f" Mar 07 06:56:46 crc kubenswrapper[4941]: I0307 06:56:46.010782 4941 scope.go:117] "RemoveContainer" containerID="479aa006a3269d36b9699bc127cb0f58ea8c98ce58e7253ad8f8e46f3d3fee96" Mar 07 06:56:46 crc kubenswrapper[4941]: I0307 06:56:46.027545 4941 scope.go:117] "RemoveContainer" containerID="c8b8486b7dc1030ea0b0429126f3c878d8cabd80645391dda26d3cc6492e9148" Mar 07 06:56:46 crc kubenswrapper[4941]: I0307 06:56:46.043162 4941 scope.go:117] "RemoveContainer" containerID="a909087c5e7f9b597738d4b74e2e7ef18068cb17b9153be0d5d5358ae1fb2b76" Mar 07 06:56:46 crc kubenswrapper[4941]: I0307 06:56:46.058778 4941 scope.go:117] "RemoveContainer" containerID="0011dd04731077af484dcf6c9a9b10dc1890785677b56fa120aa2743ece9e9bb" Mar 07 06:56:48 crc kubenswrapper[4941]: E0307 06:56:48.667764 4941 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:48 crc kubenswrapper[4941]: E0307 06:56:48.668809 4941 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:48 crc kubenswrapper[4941]: E0307 06:56:48.669468 4941 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:48 crc kubenswrapper[4941]: E0307 06:56:48.669931 4941 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:48 crc kubenswrapper[4941]: E0307 06:56:48.670465 4941 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:48 crc kubenswrapper[4941]: I0307 06:56:48.670534 4941 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 07 06:56:48 crc kubenswrapper[4941]: E0307 06:56:48.671356 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms" Mar 07 06:56:48 crc kubenswrapper[4941]: E0307 06:56:48.873342 
4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Mar 07 06:56:49 crc kubenswrapper[4941]: E0307 06:56:49.274857 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Mar 07 06:56:50 crc kubenswrapper[4941]: E0307 06:56:50.076525 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Mar 07 06:56:51 crc kubenswrapper[4941]: E0307 06:56:51.678277 4941 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Mar 07 06:56:52 crc kubenswrapper[4941]: E0307 06:56:52.800241 4941 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189a7cc84603cb8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:56:42.93366875 +0000 UTC m=+299.886034215,LastTimestamp:2026-03-07 06:56:42.93366875 +0000 UTC m=+299.886034215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:56:52 crc kubenswrapper[4941]: I0307 06:56:52.953795 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:52 crc kubenswrapper[4941]: I0307 06:56:52.954742 4941 status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:52 crc kubenswrapper[4941]: E0307 06:56:52.974657 4941 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" volumeName="registry-storage" Mar 07 06:56:52 crc kubenswrapper[4941]: I0307 06:56:52.979996 4941 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:56:52 crc kubenswrapper[4941]: I0307 06:56:52.980037 4941 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:56:52 crc kubenswrapper[4941]: E0307 06:56:52.980768 4941 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:52 crc kubenswrapper[4941]: I0307 06:56:52.981646 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:53 crc kubenswrapper[4941]: W0307 06:56:53.023770 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-c30cc987e84e9ee6f3162c73385dfb4429be8585286a098debd6e291d109f0f2 WatchSource:0}: Error finding container c30cc987e84e9ee6f3162c73385dfb4429be8585286a098debd6e291d109f0f2: Status 404 returned error can't find the container with id c30cc987e84e9ee6f3162c73385dfb4429be8585286a098debd6e291d109f0f2 Mar 07 06:56:53 crc kubenswrapper[4941]: I0307 06:56:53.959213 4941 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:53 crc kubenswrapper[4941]: I0307 06:56:53.960534 4941 status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:54 crc kubenswrapper[4941]: I0307 06:56:54.037754 4941 generic.go:334] "Generic (PLEG): container 
finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="dc6872c6f65b0781af08691e402bfe0d98a2c13a08ff65bbab52864c8d892448" exitCode=0 Mar 07 06:56:54 crc kubenswrapper[4941]: I0307 06:56:54.037804 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"dc6872c6f65b0781af08691e402bfe0d98a2c13a08ff65bbab52864c8d892448"} Mar 07 06:56:54 crc kubenswrapper[4941]: I0307 06:56:54.037834 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c30cc987e84e9ee6f3162c73385dfb4429be8585286a098debd6e291d109f0f2"} Mar 07 06:56:54 crc kubenswrapper[4941]: I0307 06:56:54.038107 4941 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:56:54 crc kubenswrapper[4941]: I0307 06:56:54.038121 4941 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:56:54 crc kubenswrapper[4941]: E0307 06:56:54.038395 4941 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:54 crc kubenswrapper[4941]: I0307 06:56:54.038449 4941 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:54 crc kubenswrapper[4941]: I0307 06:56:54.038780 4941 
status_manager.go:851] "Failed to get status for pod" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 07 06:56:55 crc kubenswrapper[4941]: I0307 06:56:55.056656 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cf301248c4b49cc2a80f92ee9fd1143f7166caffeff1d6d6ddee2ffd82061565"} Mar 07 06:56:55 crc kubenswrapper[4941]: I0307 06:56:55.057152 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"797967b26f5e269f2ca7b620e8eabe611c149a2280c970a5ba42fb20b47faf61"} Mar 07 06:56:55 crc kubenswrapper[4941]: I0307 06:56:55.057166 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bbee97d6fc82d703088bc1b911fd235a6a7672b23199b4f0265e9ed984fc1d6d"} Mar 07 06:56:55 crc kubenswrapper[4941]: I0307 06:56:55.057177 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c05fd2aadf6f3ec395fb185d3d3e0e7335e75ef66f74f59cd892d3e92952c1b0"} Mar 07 06:56:56 crc kubenswrapper[4941]: I0307 06:56:56.065777 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"02ca8ba7879d6aee7238ac528d86f9568fc689c974a9606bbd4ac030a08df48f"} Mar 07 06:56:56 crc kubenswrapper[4941]: I0307 06:56:56.066858 4941 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:56 crc kubenswrapper[4941]: I0307 06:56:56.066130 4941 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:56:56 crc kubenswrapper[4941]: I0307 06:56:56.067066 4941 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:56:57 crc kubenswrapper[4941]: I0307 06:56:57.076013 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 06:56:57 crc kubenswrapper[4941]: I0307 06:56:57.077168 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 07 06:56:57 crc kubenswrapper[4941]: I0307 06:56:57.077237 4941 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d" exitCode=1 Mar 07 06:56:57 crc kubenswrapper[4941]: I0307 06:56:57.077280 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d"} Mar 07 06:56:57 crc kubenswrapper[4941]: I0307 06:56:57.077998 4941 scope.go:117] "RemoveContainer" containerID="eaa1e1f6735a45057d103c892e2f705486543cfaf69e175f3ed4a9ac0248779d" Mar 07 06:56:57 crc kubenswrapper[4941]: I0307 06:56:57.982200 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:57 crc kubenswrapper[4941]: I0307 
06:56:57.982288 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:57 crc kubenswrapper[4941]: I0307 06:56:57.989276 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:56:58 crc kubenswrapper[4941]: I0307 06:56:58.086901 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 06:56:58 crc kubenswrapper[4941]: I0307 06:56:58.087607 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 07 06:56:58 crc kubenswrapper[4941]: I0307 06:56:58.087681 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9557356007a2f4e5511d83b2daace406deffd8183919bec28f6a71c9381d974c"} Mar 07 06:56:59 crc kubenswrapper[4941]: I0307 06:56:59.586381 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:56:59 crc kubenswrapper[4941]: I0307 06:56:59.592432 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:57:00 crc kubenswrapper[4941]: I0307 06:57:00.101049 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:57:01 crc kubenswrapper[4941]: I0307 06:57:01.081693 4941 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:57:01 crc kubenswrapper[4941]: 
I0307 06:57:01.105648 4941 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:56:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:56:54Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T06:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c05fd2aadf6f3ec395fb185d3d3e0e7335e75ef66f74f59cd892d3e92952c1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:56:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://797967b26f5e269f2ca7b620e8eabe611c149a2280c970a5ba42fb20b47faf61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:56:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbee97d6fc82d703088bc1b911fd235a6a7672b23199b4f0265e9ed984fc1d6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-op
erator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:56:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ca8ba7879d6aee7238ac528d86f9568fc689c974a9606bbd4ac030a08df48f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:56:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf301248c4b49cc2a80f92ee9fd1143f7166caffeff1d6d6ddee2ffd82061565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:56:55Z\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6872c6f65b0781af08691e402bfe0d98a2c13a08ff65bbab52864c8d892448\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6872c6f65b0781af08691e402bfe0d98a2c13a08ff65bbab52864c8d892448\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:56:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:56:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}]}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod \"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f\": field is immutable" Mar 07 06:57:01 crc kubenswrapper[4941]: I0307 06:57:01.105921 4941 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:57:01 crc kubenswrapper[4941]: I0307 06:57:01.105966 4941 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:57:01 crc kubenswrapper[4941]: I0307 06:57:01.109675 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:57:01 crc kubenswrapper[4941]: I0307 06:57:01.122348 4941 status_manager.go:861] "Pod was deleted and then recreated, 
skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d9200f09-b964-467f-b65e-eb57ab478edd" Mar 07 06:57:02 crc kubenswrapper[4941]: I0307 06:57:02.113122 4941 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:57:02 crc kubenswrapper[4941]: I0307 06:57:02.113724 4941 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:57:02 crc kubenswrapper[4941]: I0307 06:57:02.117759 4941 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d9200f09-b964-467f-b65e-eb57ab478edd" Mar 07 06:57:07 crc kubenswrapper[4941]: I0307 06:57:07.343231 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 07 06:57:07 crc kubenswrapper[4941]: I0307 06:57:07.668172 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 07 06:57:08 crc kubenswrapper[4941]: I0307 06:57:08.463391 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 07 06:57:08 crc kubenswrapper[4941]: I0307 06:57:08.523837 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 07 06:57:08 crc kubenswrapper[4941]: I0307 06:57:08.875366 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.371714 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 07 
06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.439809 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.742326 4941 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.748536 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.748602 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.748945 4941 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.748976 4941 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5b6a1cd-73ac-4dfa-a3e2-486b20ac999f" Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.752555 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.773506 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=8.773486188 podStartE2EDuration="8.773486188s" podCreationTimestamp="2026-03-07 06:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:57:09.768397133 +0000 UTC m=+326.720762608" watchObservedRunningTime="2026-03-07 06:57:09.773486188 +0000 UTC m=+326.725851663" Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.857679 4941 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.993128 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 06:57:09 crc kubenswrapper[4941]: I0307 06:57:09.999615 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 07 06:57:10 crc kubenswrapper[4941]: I0307 06:57:10.351433 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 07 06:57:10 crc kubenswrapper[4941]: I0307 06:57:10.570473 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 06:57:11 crc kubenswrapper[4941]: I0307 06:57:11.979770 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:57:12 crc kubenswrapper[4941]: I0307 06:57:12.143222 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 06:57:12 crc kubenswrapper[4941]: I0307 06:57:12.189375 4941 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 06:57:12 crc kubenswrapper[4941]: I0307 06:57:12.189640 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b" gracePeriod=5 Mar 07 06:57:12 crc kubenswrapper[4941]: I0307 06:57:12.307748 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 07 06:57:12 crc kubenswrapper[4941]: I0307 06:57:12.452624 
4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 07 06:57:13 crc kubenswrapper[4941]: I0307 06:57:13.016623 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 07 06:57:13 crc kubenswrapper[4941]: I0307 06:57:13.237122 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 06:57:13 crc kubenswrapper[4941]: I0307 06:57:13.254548 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 07 06:57:13 crc kubenswrapper[4941]: I0307 06:57:13.274874 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 07 06:57:13 crc kubenswrapper[4941]: I0307 06:57:13.567298 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 07 06:57:13 crc kubenswrapper[4941]: I0307 06:57:13.822788 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 07 06:57:13 crc kubenswrapper[4941]: I0307 06:57:13.927319 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 07 06:57:14 crc kubenswrapper[4941]: I0307 06:57:14.153119 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 07 06:57:14 crc kubenswrapper[4941]: I0307 06:57:14.738895 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 07 06:57:14 crc kubenswrapper[4941]: I0307 06:57:14.920762 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 
07 06:57:14 crc kubenswrapper[4941]: I0307 06:57:14.929902 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 06:57:14 crc kubenswrapper[4941]: I0307 06:57:14.996668 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.036651 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.319528 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.421344 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.511505 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.610298 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.662270 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.752474 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.794656 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 07 06:57:15 crc kubenswrapper[4941]: I0307 06:57:15.822997 4941 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.037190 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.144880 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.450666 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.600174 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.664252 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.753975 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.778717 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.857865 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 07 06:57:16 crc kubenswrapper[4941]: I0307 06:57:16.999111 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.033294 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 
06:57:17.099031 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.110579 4941 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.120743 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.223581 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.230947 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.251089 4941 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.261250 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.275364 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.322283 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.466579 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.608464 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 
06:57:17.627460 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.691764 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.775622 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.792229 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.833730 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.833859 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.903987 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.977333 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.977393 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.977485 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.977523 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.977599 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 
06:57:17.977665 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.977721 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.977822 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.978123 4941 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.978151 4941 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.978165 4941 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.978200 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:57:17 crc kubenswrapper[4941]: I0307 06:57:17.988926 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.002886 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.031130 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.041471 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.079879 4941 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.079947 4941 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.122252 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.143088 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.226893 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.233604 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.233661 4941 generic.go:334] "Generic 
(PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b" exitCode=137 Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.233708 4941 scope.go:117] "RemoveContainer" containerID="007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.233792 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.263688 4941 scope.go:117] "RemoveContainer" containerID="007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b" Mar 07 06:57:18 crc kubenswrapper[4941]: E0307 06:57:18.264285 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b\": container with ID starting with 007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b not found: ID does not exist" containerID="007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.264361 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b"} err="failed to get container status \"007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b\": rpc error: code = NotFound desc = could not find container \"007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b\": container with ID starting with 007afaa50d01ddf1e2b67f5cae2f3737649123512558d44d885b01bca4fef43b not found: ID does not exist" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.275889 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" 
Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.355499 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.398710 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.409621 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.446147 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.608095 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.623035 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.653442 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.663093 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.702716 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.707068 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.736426 4941 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.790149 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.804268 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.862316 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.866548 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.882697 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.913661 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.915189 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.958566 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 07 06:57:18 crc kubenswrapper[4941]: I0307 06:57:18.976614 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.002236 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 
06:57:19.071265 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.235366 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.270050 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.273474 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.326084 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.375197 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.375453 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.491471 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.582694 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.607149 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.703909 4941 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.768766 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.768766 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.871201 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.938997 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.960639 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.961644 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 07 06:57:19 crc kubenswrapper[4941]: I0307 06:57:19.984009 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.008368 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.051027 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.132691 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 07 06:57:20 crc kubenswrapper[4941]: 
I0307 06:57:20.153471 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.208884 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.210592 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.246298 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.322654 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.372762 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.513660 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.711201 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.793388 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.814511 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.830774 4941 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.842228 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.846105 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.878482 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.896370 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 07 06:57:20 crc kubenswrapper[4941]: I0307 06:57:20.908739 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.007725 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.141086 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.205546 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.210921 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.244260 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.262083 4941 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.372950 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.417930 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.420281 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.436091 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.522859 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.548715 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.567550 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.650523 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.701075 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.726708 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.737471 4941 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.820104 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.923812 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.941951 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 06:57:21 crc kubenswrapper[4941]: I0307 06:57:21.964949 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.088065 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.115669 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.163665 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.208417 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.236284 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.254699 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.496568 
4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.535308 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.601145 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.663993 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.708365 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.710632 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.773756 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.778180 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.876207 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 06:57:22 crc kubenswrapper[4941]: I0307 06:57:22.943944 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.027181 4941 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.038876 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.136387 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.143008 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.191531 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.246086 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.260268 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.282877 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.400495 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.534078 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.577692 4941 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.661988 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.683715 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.759104 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.797294 4941 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.875123 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 07 06:57:23 crc kubenswrapper[4941]: I0307 06:57:23.975761 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.026486 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.309102 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.341249 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.411192 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.440164 4941 Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.505363 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.537075 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.641739 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.674480 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.675216 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.695689 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.696672 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.697467 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.740964 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.778671 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.903743 4941 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 07 06:57:24 crc kubenswrapper[4941]: I0307 06:57:24.940867 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.015011 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.019022 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.051340 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.133937 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.224570 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.370488 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.415079 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.662383 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.745678 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.893633 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.938122 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.955321 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.976802 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 07 06:57:25 crc kubenswrapper[4941]: I0307 06:57:25.995602 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.067534 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.072652 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.151666 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.165028 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.268301 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.384007 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.407771 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.463365 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.525923 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.639365 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.668827 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.697648 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 07 06:57:26 crc kubenswrapper[4941]: I0307 06:57:26.722106 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.027134 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.146700 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.165613 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.223847 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.227182 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.259268 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.276578 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.306065 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.322115 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.484737 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.544646 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.617468 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.682207 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.809452 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.835127 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 07 06:57:27 crc kubenswrapper[4941]: I0307 06:57:27.866973 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 07 06:57:28 crc kubenswrapper[4941]: I0307 06:57:28.352273 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 07 06:57:28 crc kubenswrapper[4941]: I0307 06:57:28.377782 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 07 06:57:28 crc kubenswrapper[4941]: I0307 06:57:28.461217 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 07 06:57:28 crc kubenswrapper[4941]: I0307 06:57:28.716968 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 07 06:57:29 crc kubenswrapper[4941]: I0307 06:57:29.228325 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 07 06:57:29 crc kubenswrapper[4941]: I0307 06:57:29.384083 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 07 06:57:29 crc kubenswrapper[4941]: I0307 06:57:29.395913 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 07 06:57:29 crc kubenswrapper[4941]: I0307 06:57:29.640591 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 07 06:57:30 crc kubenswrapper[4941]: I0307 06:57:30.001134 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 07 06:57:30 crc kubenswrapper[4941]: I0307 06:57:30.647202 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 07 06:57:30 crc kubenswrapper[4941]: I0307 06:57:30.830219 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.865757 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4zqp"]
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.866999 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z4zqp" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" containerName="registry-server" containerID="cri-o://7bf7f4695cd43c0565ac5f6b74ad1e3889ca846037a49295da8a36a319c304e1" gracePeriod=30
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.872616 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qx5dk"]
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.872980 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qx5dk" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" containerName="registry-server" containerID="cri-o://e004e0f089b7ba7eb1801ec7f91f0a05195695ac76747a0d6e8e7433e09dcb2b" gracePeriod=30
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.877826 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7x6zc"]
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.878102 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" podUID="baf7bbe6-5859-4df3-9164-a62bb2333078" containerName="marketplace-operator" containerID="cri-o://00dd23d15607af828fd94937288ed330bed79fbe736af6984249875b6aa4eb04" gracePeriod=30
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.887933 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v5x2"]
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.888354 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5v5x2" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" containerName="registry-server" containerID="cri-o://61051f5c8b80c92ee631a503d514c20a48781547bdf5fd70e1ba8264116986fa" gracePeriod=30
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.896306 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g4dmh"]
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.897019 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g4dmh" podUID="36212ca9-755e-4104-a203-7c136afbfca9" containerName="registry-server" containerID="cri-o://c336ec75fb6e373e93af422f7d66b047b8f9175277903903ad6a67be085eb0fe" gracePeriod=30
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.909995 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ktp7f"]
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.910316 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ktp7f" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="registry-server" containerID="cri-o://131db25e24bc1e94bca46dfac0d77253c12d4fb72dfb909ae71564be9fafd63d" gracePeriod=30
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.931422 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l9krl"]
Mar 07 06:57:43 crc kubenswrapper[4941]: E0307 06:57:43.931718 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.931744 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 07 06:57:43 crc kubenswrapper[4941]: E0307 06:57:43.931779 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" containerName="installer"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.931792 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" containerName="installer"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.931911 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5abd453-0ae9-420c-92b5-84b76e1b4a6a" containerName="installer"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.931930 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.932481 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.943051 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l9krl"]
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.976262 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6683fc4-e18a-403c-a62a-0c451060c844-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.977056 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f6683fc4-e18a-403c-a62a-0c451060c844-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:43 crc kubenswrapper[4941]: I0307 06:57:43.977169 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gngd\" (UniqueName: \"kubernetes.io/projected/f6683fc4-e18a-403c-a62a-0c451060c844-kube-api-access-5gngd\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.078812 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gngd\" (UniqueName: \"kubernetes.io/projected/f6683fc4-e18a-403c-a62a-0c451060c844-kube-api-access-5gngd\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.078903 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6683fc4-e18a-403c-a62a-0c451060c844-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.078928 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f6683fc4-e18a-403c-a62a-0c451060c844-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.082758 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6683fc4-e18a-403c-a62a-0c451060c844-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.110283 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f6683fc4-e18a-403c-a62a-0c451060c844-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.113641 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gngd\" (UniqueName: \"kubernetes.io/projected/f6683fc4-e18a-403c-a62a-0c451060c844-kube-api-access-5gngd\") pod \"marketplace-operator-79b997595-l9krl\" (UID: \"f6683fc4-e18a-403c-a62a-0c451060c844\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.405936 4941 generic.go:334] "Generic (PLEG): container finished" podID="805b56ac-66fd-4704-adb1-f3968f17f835" containerID="7bf7f4695cd43c0565ac5f6b74ad1e3889ca846037a49295da8a36a319c304e1" exitCode=0
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.406570 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4zqp" event={"ID":"805b56ac-66fd-4704-adb1-f3968f17f835","Type":"ContainerDied","Data":"7bf7f4695cd43c0565ac5f6b74ad1e3889ca846037a49295da8a36a319c304e1"}
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.406614 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4zqp" event={"ID":"805b56ac-66fd-4704-adb1-f3968f17f835","Type":"ContainerDied","Data":"d781c5e24e57e991e7373b372f99c3036fbf6041cf926262dcdddba61dae5240"}
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.406629 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d781c5e24e57e991e7373b372f99c3036fbf6041cf926262dcdddba61dae5240"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.410512 4941 generic.go:334] "Generic (PLEG): container finished" podID="baf7bbe6-5859-4df3-9164-a62bb2333078" containerID="00dd23d15607af828fd94937288ed330bed79fbe736af6984249875b6aa4eb04" exitCode=0
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.410574 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" event={"ID":"baf7bbe6-5859-4df3-9164-a62bb2333078","Type":"ContainerDied","Data":"00dd23d15607af828fd94937288ed330bed79fbe736af6984249875b6aa4eb04"}
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.414046 4941 generic.go:334] "Generic (PLEG): container finished" podID="a10e3708-a476-4698-aa8d-ba99a795524a" containerID="61051f5c8b80c92ee631a503d514c20a48781547bdf5fd70e1ba8264116986fa" exitCode=0
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.414096 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v5x2" event={"ID":"a10e3708-a476-4698-aa8d-ba99a795524a","Type":"ContainerDied","Data":"61051f5c8b80c92ee631a503d514c20a48781547bdf5fd70e1ba8264116986fa"}
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.416261 4941 generic.go:334] "Generic (PLEG): container finished" podID="36212ca9-755e-4104-a203-7c136afbfca9" containerID="c336ec75fb6e373e93af422f7d66b047b8f9175277903903ad6a67be085eb0fe" exitCode=0
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.416301 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g4dmh" event={"ID":"36212ca9-755e-4104-a203-7c136afbfca9","Type":"ContainerDied","Data":"c336ec75fb6e373e93af422f7d66b047b8f9175277903903ad6a67be085eb0fe"}
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.418106 4941 generic.go:334] "Generic (PLEG): container finished" podID="bdb71b40-ad9b-405b-a178-158109d65a92" containerID="e004e0f089b7ba7eb1801ec7f91f0a05195695ac76747a0d6e8e7433e09dcb2b" exitCode=0
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.418153 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx5dk" event={"ID":"bdb71b40-ad9b-405b-a178-158109d65a92","Type":"ContainerDied","Data":"e004e0f089b7ba7eb1801ec7f91f0a05195695ac76747a0d6e8e7433e09dcb2b"}
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.419595 4941 generic.go:334] "Generic (PLEG): container finished" podID="d003569d-8946-47e7-adf2-5148ca8de944" containerID="131db25e24bc1e94bca46dfac0d77253c12d4fb72dfb909ae71564be9fafd63d" exitCode=0
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.419625 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktp7f" event={"ID":"d003569d-8946-47e7-adf2-5148ca8de944","Type":"ContainerDied","Data":"131db25e24bc1e94bca46dfac0d77253c12d4fb72dfb909ae71564be9fafd63d"}
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.453126 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l9krl"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.476857 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4zqp"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.477168 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.496287 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v5x2"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.517050 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g4dmh"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.523273 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ktp7f"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.526958 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qx5dk"
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.585117 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-operator-metrics\") pod \"baf7bbe6-5859-4df3-9164-a62bb2333078\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.585161 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cs5v\" (UniqueName: \"kubernetes.io/projected/805b56ac-66fd-4704-adb1-f3968f17f835-kube-api-access-4cs5v\") pod \"805b56ac-66fd-4704-adb1-f3968f17f835\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.585204 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-catalog-content\") pod \"805b56ac-66fd-4704-adb1-f3968f17f835\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.585235 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-trusted-ca\") pod \"baf7bbe6-5859-4df3-9164-a62bb2333078\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.585256 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-utilities\") pod \"805b56ac-66fd-4704-adb1-f3968f17f835\" (UID: \"805b56ac-66fd-4704-adb1-f3968f17f835\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.585283 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l9gg\" (UniqueName: \"kubernetes.io/projected/baf7bbe6-5859-4df3-9164-a62bb2333078-kube-api-access-9l9gg\") pod \"baf7bbe6-5859-4df3-9164-a62bb2333078\" (UID: \"baf7bbe6-5859-4df3-9164-a62bb2333078\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.587276 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "baf7bbe6-5859-4df3-9164-a62bb2333078" (UID: "baf7bbe6-5859-4df3-9164-a62bb2333078"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.587634 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-utilities" (OuterVolumeSpecName: "utilities") pod "805b56ac-66fd-4704-adb1-f3968f17f835" (UID: "805b56ac-66fd-4704-adb1-f3968f17f835"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.594683 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805b56ac-66fd-4704-adb1-f3968f17f835-kube-api-access-4cs5v" (OuterVolumeSpecName: "kube-api-access-4cs5v") pod "805b56ac-66fd-4704-adb1-f3968f17f835" (UID: "805b56ac-66fd-4704-adb1-f3968f17f835"). InnerVolumeSpecName "kube-api-access-4cs5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.595582 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "baf7bbe6-5859-4df3-9164-a62bb2333078" (UID: "baf7bbe6-5859-4df3-9164-a62bb2333078"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.598805 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf7bbe6-5859-4df3-9164-a62bb2333078-kube-api-access-9l9gg" (OuterVolumeSpecName: "kube-api-access-9l9gg") pod "baf7bbe6-5859-4df3-9164-a62bb2333078" (UID: "baf7bbe6-5859-4df3-9164-a62bb2333078"). InnerVolumeSpecName "kube-api-access-9l9gg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686106 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-catalog-content\") pod \"bdb71b40-ad9b-405b-a178-158109d65a92\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686154 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-catalog-content\") pod \"d003569d-8946-47e7-adf2-5148ca8de944\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686237 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5lmp\" (UniqueName: \"kubernetes.io/projected/d003569d-8946-47e7-adf2-5148ca8de944-kube-api-access-r5lmp\") pod \"d003569d-8946-47e7-adf2-5148ca8de944\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686276 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-utilities\") pod \"bdb71b40-ad9b-405b-a178-158109d65a92\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686306 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-254k7\" (UniqueName: \"kubernetes.io/projected/bdb71b40-ad9b-405b-a178-158109d65a92-kube-api-access-254k7\") pod \"bdb71b40-ad9b-405b-a178-158109d65a92\" (UID: \"bdb71b40-ad9b-405b-a178-158109d65a92\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686326 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mz8n\" (UniqueName: \"kubernetes.io/projected/a10e3708-a476-4698-aa8d-ba99a795524a-kube-api-access-4mz8n\") pod \"a10e3708-a476-4698-aa8d-ba99a795524a\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686355 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k5mh\" (UniqueName: \"kubernetes.io/projected/36212ca9-755e-4104-a203-7c136afbfca9-kube-api-access-2k5mh\") pod \"36212ca9-755e-4104-a203-7c136afbfca9\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686378 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-utilities\") pod \"d003569d-8946-47e7-adf2-5148ca8de944\" (UID: \"d003569d-8946-47e7-adf2-5148ca8de944\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686413 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-catalog-content\") pod \"a10e3708-a476-4698-aa8d-ba99a795524a\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686467 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-catalog-content\") pod \"36212ca9-755e-4104-a203-7c136afbfca9\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686489 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-utilities\") pod \"a10e3708-a476-4698-aa8d-ba99a795524a\" (UID: \"a10e3708-a476-4698-aa8d-ba99a795524a\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686518 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-utilities\") pod \"36212ca9-755e-4104-a203-7c136afbfca9\" (UID: \"36212ca9-755e-4104-a203-7c136afbfca9\") "
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686785 4941 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.686803 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cs5v\" (UniqueName: \"kubernetes.io/projected/805b56ac-66fd-4704-adb1-f3968f17f835-kube-api-access-4cs5v\") on node \"crc\" DevicePath \"\""
Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.687086 4941 operation_generator.go:803] UnmountVolume.TearDown
succeeded for volume "kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "805b56ac-66fd-4704-adb1-f3968f17f835" (UID: "805b56ac-66fd-4704-adb1-f3968f17f835"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.687773 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-utilities" (OuterVolumeSpecName: "utilities") pod "36212ca9-755e-4104-a203-7c136afbfca9" (UID: "36212ca9-755e-4104-a203-7c136afbfca9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.691556 4941 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baf7bbe6-5859-4df3-9164-a62bb2333078-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.691597 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.691612 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l9gg\" (UniqueName: \"kubernetes.io/projected/baf7bbe6-5859-4df3-9164-a62bb2333078-kube-api-access-9l9gg\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.692423 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-utilities" (OuterVolumeSpecName: "utilities") pod "d003569d-8946-47e7-adf2-5148ca8de944" (UID: "d003569d-8946-47e7-adf2-5148ca8de944"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.692669 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-utilities" (OuterVolumeSpecName: "utilities") pod "bdb71b40-ad9b-405b-a178-158109d65a92" (UID: "bdb71b40-ad9b-405b-a178-158109d65a92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.692816 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10e3708-a476-4698-aa8d-ba99a795524a-kube-api-access-4mz8n" (OuterVolumeSpecName: "kube-api-access-4mz8n") pod "a10e3708-a476-4698-aa8d-ba99a795524a" (UID: "a10e3708-a476-4698-aa8d-ba99a795524a"). InnerVolumeSpecName "kube-api-access-4mz8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.692972 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36212ca9-755e-4104-a203-7c136afbfca9-kube-api-access-2k5mh" (OuterVolumeSpecName: "kube-api-access-2k5mh") pod "36212ca9-755e-4104-a203-7c136afbfca9" (UID: "36212ca9-755e-4104-a203-7c136afbfca9"). InnerVolumeSpecName "kube-api-access-2k5mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.697794 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d003569d-8946-47e7-adf2-5148ca8de944-kube-api-access-r5lmp" (OuterVolumeSpecName: "kube-api-access-r5lmp") pod "d003569d-8946-47e7-adf2-5148ca8de944" (UID: "d003569d-8946-47e7-adf2-5148ca8de944"). InnerVolumeSpecName "kube-api-access-r5lmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.697879 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb71b40-ad9b-405b-a178-158109d65a92-kube-api-access-254k7" (OuterVolumeSpecName: "kube-api-access-254k7") pod "bdb71b40-ad9b-405b-a178-158109d65a92" (UID: "bdb71b40-ad9b-405b-a178-158109d65a92"). InnerVolumeSpecName "kube-api-access-254k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.701170 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-utilities" (OuterVolumeSpecName: "utilities") pod "a10e3708-a476-4698-aa8d-ba99a795524a" (UID: "a10e3708-a476-4698-aa8d-ba99a795524a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.729581 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a10e3708-a476-4698-aa8d-ba99a795524a" (UID: "a10e3708-a476-4698-aa8d-ba99a795524a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.768249 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdb71b40-ad9b-405b-a178-158109d65a92" (UID: "bdb71b40-ad9b-405b-a178-158109d65a92"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795372 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805b56ac-66fd-4704-adb1-f3968f17f835-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795449 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795469 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795485 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795531 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5lmp\" (UniqueName: \"kubernetes.io/projected/d003569d-8946-47e7-adf2-5148ca8de944-kube-api-access-r5lmp\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795547 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb71b40-ad9b-405b-a178-158109d65a92-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795559 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-254k7\" (UniqueName: \"kubernetes.io/projected/bdb71b40-ad9b-405b-a178-158109d65a92-kube-api-access-254k7\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795571 
4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mz8n\" (UniqueName: \"kubernetes.io/projected/a10e3708-a476-4698-aa8d-ba99a795524a-kube-api-access-4mz8n\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795586 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k5mh\" (UniqueName: \"kubernetes.io/projected/36212ca9-755e-4104-a203-7c136afbfca9-kube-api-access-2k5mh\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795598 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.795609 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a10e3708-a476-4698-aa8d-ba99a795524a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.843747 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d003569d-8946-47e7-adf2-5148ca8de944" (UID: "d003569d-8946-47e7-adf2-5148ca8de944"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.862057 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36212ca9-755e-4104-a203-7c136afbfca9" (UID: "36212ca9-755e-4104-a203-7c136afbfca9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.896856 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36212ca9-755e-4104-a203-7c136afbfca9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.896905 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d003569d-8946-47e7-adf2-5148ca8de944-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 06:57:44 crc kubenswrapper[4941]: I0307 06:57:44.935840 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l9krl"] Mar 07 06:57:44 crc kubenswrapper[4941]: W0307 06:57:44.941752 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6683fc4_e18a_403c_a62a_0c451060c844.slice/crio-1ed82df8082b9d6b5dadc86e6c08649a9406badaa429160ee23fa52ff468aec0 WatchSource:0}: Error finding container 1ed82df8082b9d6b5dadc86e6c08649a9406badaa429160ee23fa52ff468aec0: Status 404 returned error can't find the container with id 1ed82df8082b9d6b5dadc86e6c08649a9406badaa429160ee23fa52ff468aec0 Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.427220 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ktp7f" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.427310 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktp7f" event={"ID":"d003569d-8946-47e7-adf2-5148ca8de944","Type":"ContainerDied","Data":"317b1eb31db1baeb331bfbf3988ca6ad93caa39b7bd3580dd19ca28e7a532202"} Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.427926 4941 scope.go:117] "RemoveContainer" containerID="131db25e24bc1e94bca46dfac0d77253c12d4fb72dfb909ae71564be9fafd63d" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.428597 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" event={"ID":"baf7bbe6-5859-4df3-9164-a62bb2333078","Type":"ContainerDied","Data":"23b34134dbf17c1b57890da59896a149df951633d5bb2eb3656826fd43df1f99"} Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.428632 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7x6zc" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.433964 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v5x2" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.434377 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v5x2" event={"ID":"a10e3708-a476-4698-aa8d-ba99a795524a","Type":"ContainerDied","Data":"2c19b6640fd50716e9f0f5c7f5094ff34bad0f1bf25bfc08da9aa9fec332d040"} Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.436261 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g4dmh" event={"ID":"36212ca9-755e-4104-a203-7c136afbfca9","Type":"ContainerDied","Data":"ff630edcad373f9613081a2a57da9c3f0cbe2b615c43a5ae01227878d85eefca"} Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.436385 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g4dmh" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.444925 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l9krl" event={"ID":"f6683fc4-e18a-403c-a62a-0c451060c844","Type":"ContainerStarted","Data":"5cf442db3b486cf3c91a7d7fa714cd6788e499cfbe1cfb4493910853df9ce447"} Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.444990 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l9krl" event={"ID":"f6683fc4-e18a-403c-a62a-0c451060c844","Type":"ContainerStarted","Data":"1ed82df8082b9d6b5dadc86e6c08649a9406badaa429160ee23fa52ff468aec0"} Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.445156 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-l9krl" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.447774 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4zqp" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.448581 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qx5dk" event={"ID":"bdb71b40-ad9b-405b-a178-158109d65a92","Type":"ContainerDied","Data":"6c1ec96b1642fa62a471bb9082b309cacc8b442db698be53b878d706c1ad8d95"} Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.449398 4941 scope.go:117] "RemoveContainer" containerID="d7683ad9d1dce5316d63834d4de252268c91d6465c504103cf7535ac1e1040d6" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.449534 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qx5dk" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.451994 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-l9krl" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.480698 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-l9krl" podStartSLOduration=2.480672476 podStartE2EDuration="2.480672476s" podCreationTimestamp="2026-03-07 06:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:57:45.477709577 +0000 UTC m=+362.430075052" watchObservedRunningTime="2026-03-07 06:57:45.480672476 +0000 UTC m=+362.433037941" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.492649 4941 scope.go:117] "RemoveContainer" containerID="d2d3c310234855d9e00e2325c2f877b0fb93bea45fc23a8a441a51c53dddc621" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.513188 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7x6zc"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.519768 4941 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7x6zc"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.526439 4941 scope.go:117] "RemoveContainer" containerID="00dd23d15607af828fd94937288ed330bed79fbe736af6984249875b6aa4eb04" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.533780 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qx5dk"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.549798 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qx5dk"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.553570 4941 scope.go:117] "RemoveContainer" containerID="61051f5c8b80c92ee631a503d514c20a48781547bdf5fd70e1ba8264116986fa" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.562828 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v5x2"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.567761 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v5x2"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.571239 4941 scope.go:117] "RemoveContainer" containerID="021eb75d33ce6a9734e651ac4fb1cca81f1441e309586915904911dc8dbb5b37" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.571614 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ktp7f"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.576090 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ktp7f"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.579863 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g4dmh"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.584476 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-g4dmh"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.587773 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4zqp"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.588378 4941 scope.go:117] "RemoveContainer" containerID="756e5b0af3038eb53f3c1b907a19edb61aa6b2a46afc769da1b7fa1daeb32ddf" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.589178 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z4zqp"] Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.610091 4941 scope.go:117] "RemoveContainer" containerID="c336ec75fb6e373e93af422f7d66b047b8f9175277903903ad6a67be085eb0fe" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.627291 4941 scope.go:117] "RemoveContainer" containerID="172a5846ac2238be5a04b1a4c39628b678676e1d9cab2409d32d6c0195b72ba2" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.652111 4941 scope.go:117] "RemoveContainer" containerID="f212e51eddef03cf072afe8f79363a25de336b0313cab1ee4c707559d7144f03" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.666384 4941 scope.go:117] "RemoveContainer" containerID="e004e0f089b7ba7eb1801ec7f91f0a05195695ac76747a0d6e8e7433e09dcb2b" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.683685 4941 scope.go:117] "RemoveContainer" containerID="43eab2b87765e736865e75c97c7f4a343f30404449579af0a35b9ccac4bbf877" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.698443 4941 scope.go:117] "RemoveContainer" containerID="ad2c70584d1e673c0a880b6d03c0354c9189453946e60a5dfb6e744f65c96d77" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.960434 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36212ca9-755e-4104-a203-7c136afbfca9" path="/var/lib/kubelet/pods/36212ca9-755e-4104-a203-7c136afbfca9/volumes" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.961171 4941 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" path="/var/lib/kubelet/pods/805b56ac-66fd-4704-adb1-f3968f17f835/volumes" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.961875 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" path="/var/lib/kubelet/pods/a10e3708-a476-4698-aa8d-ba99a795524a/volumes" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.963081 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf7bbe6-5859-4df3-9164-a62bb2333078" path="/var/lib/kubelet/pods/baf7bbe6-5859-4df3-9164-a62bb2333078/volumes" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.963591 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" path="/var/lib/kubelet/pods/bdb71b40-ad9b-405b-a178-158109d65a92/volumes" Mar 07 06:57:45 crc kubenswrapper[4941]: I0307 06:57:45.964606 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d003569d-8946-47e7-adf2-5148ca8de944" path="/var/lib/kubelet/pods/d003569d-8946-47e7-adf2-5148ca8de944/volumes" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.178669 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547778-x2xv2"] Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179803 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36212ca9-755e-4104-a203-7c136afbfca9" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179822 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="36212ca9-755e-4104-a203-7c136afbfca9" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179835 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf7bbe6-5859-4df3-9164-a62bb2333078" containerName="marketplace-operator" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179843 
4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf7bbe6-5859-4df3-9164-a62bb2333078" containerName="marketplace-operator" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179860 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36212ca9-755e-4104-a203-7c136afbfca9" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179870 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="36212ca9-755e-4104-a203-7c136afbfca9" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179883 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179890 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179900 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36212ca9-755e-4104-a203-7c136afbfca9" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179907 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="36212ca9-755e-4104-a203-7c136afbfca9" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179919 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179927 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179938 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179945 
4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179953 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179961 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179971 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179978 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.179988 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.179995 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.180004 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180014 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.180022 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180031 4941 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.180039 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180045 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.180056 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180063 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" containerName="extract-content" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.180072 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180079 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: E0307 06:58:00.180094 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180101 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="extract-utilities" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180209 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10e3708-a476-4698-aa8d-ba99a795524a" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180222 4941 
memory_manager.go:354] "RemoveStaleState removing state" podUID="805b56ac-66fd-4704-adb1-f3968f17f835" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180230 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d003569d-8946-47e7-adf2-5148ca8de944" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180239 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb71b40-ad9b-405b-a178-158109d65a92" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180250 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf7bbe6-5859-4df3-9164-a62bb2333078" containerName="marketplace-operator" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180260 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="36212ca9-755e-4104-a203-7c136afbfca9" containerName="registry-server" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.180740 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547778-x2xv2" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.184222 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.184276 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.187357 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.191399 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547778-x2xv2"] Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.310841 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvz6q\" (UniqueName: \"kubernetes.io/projected/d8a8efd6-1008-4ba6-9db3-afb764026fc3-kube-api-access-cvz6q\") pod \"auto-csr-approver-29547778-x2xv2\" (UID: \"d8a8efd6-1008-4ba6-9db3-afb764026fc3\") " pod="openshift-infra/auto-csr-approver-29547778-x2xv2" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.412170 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvz6q\" (UniqueName: \"kubernetes.io/projected/d8a8efd6-1008-4ba6-9db3-afb764026fc3-kube-api-access-cvz6q\") pod \"auto-csr-approver-29547778-x2xv2\" (UID: \"d8a8efd6-1008-4ba6-9db3-afb764026fc3\") " pod="openshift-infra/auto-csr-approver-29547778-x2xv2" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.441358 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvz6q\" (UniqueName: \"kubernetes.io/projected/d8a8efd6-1008-4ba6-9db3-afb764026fc3-kube-api-access-cvz6q\") pod \"auto-csr-approver-29547778-x2xv2\" (UID: \"d8a8efd6-1008-4ba6-9db3-afb764026fc3\") " 
pod="openshift-infra/auto-csr-approver-29547778-x2xv2" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.499719 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547778-x2xv2" Mar 07 06:58:00 crc kubenswrapper[4941]: I0307 06:58:00.995178 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547778-x2xv2"] Mar 07 06:58:01 crc kubenswrapper[4941]: I0307 06:58:01.579242 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547778-x2xv2" event={"ID":"d8a8efd6-1008-4ba6-9db3-afb764026fc3","Type":"ContainerStarted","Data":"bc0d18d91b7a2a358162431f32f33134afbcf1c27ed1e34305406691dea4abcd"} Mar 07 06:58:02 crc kubenswrapper[4941]: I0307 06:58:02.586058 4941 generic.go:334] "Generic (PLEG): container finished" podID="d8a8efd6-1008-4ba6-9db3-afb764026fc3" containerID="8ee64e9a2a790df233102d47f57d259db965b475e842ee99d53a89aa716bd946" exitCode=0 Mar 07 06:58:02 crc kubenswrapper[4941]: I0307 06:58:02.586177 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547778-x2xv2" event={"ID":"d8a8efd6-1008-4ba6-9db3-afb764026fc3","Type":"ContainerDied","Data":"8ee64e9a2a790df233102d47f57d259db965b475e842ee99d53a89aa716bd946"} Mar 07 06:58:03 crc kubenswrapper[4941]: I0307 06:58:03.961830 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547778-x2xv2" Mar 07 06:58:04 crc kubenswrapper[4941]: I0307 06:58:04.063759 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvz6q\" (UniqueName: \"kubernetes.io/projected/d8a8efd6-1008-4ba6-9db3-afb764026fc3-kube-api-access-cvz6q\") pod \"d8a8efd6-1008-4ba6-9db3-afb764026fc3\" (UID: \"d8a8efd6-1008-4ba6-9db3-afb764026fc3\") " Mar 07 06:58:04 crc kubenswrapper[4941]: I0307 06:58:04.070000 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a8efd6-1008-4ba6-9db3-afb764026fc3-kube-api-access-cvz6q" (OuterVolumeSpecName: "kube-api-access-cvz6q") pod "d8a8efd6-1008-4ba6-9db3-afb764026fc3" (UID: "d8a8efd6-1008-4ba6-9db3-afb764026fc3"). InnerVolumeSpecName "kube-api-access-cvz6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:58:04 crc kubenswrapper[4941]: I0307 06:58:04.165286 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvz6q\" (UniqueName: \"kubernetes.io/projected/d8a8efd6-1008-4ba6-9db3-afb764026fc3-kube-api-access-cvz6q\") on node \"crc\" DevicePath \"\"" Mar 07 06:58:04 crc kubenswrapper[4941]: I0307 06:58:04.601716 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547778-x2xv2" event={"ID":"d8a8efd6-1008-4ba6-9db3-afb764026fc3","Type":"ContainerDied","Data":"bc0d18d91b7a2a358162431f32f33134afbcf1c27ed1e34305406691dea4abcd"} Mar 07 06:58:04 crc kubenswrapper[4941]: I0307 06:58:04.601776 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc0d18d91b7a2a358162431f32f33134afbcf1c27ed1e34305406691dea4abcd" Mar 07 06:58:04 crc kubenswrapper[4941]: I0307 06:58:04.601785 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547778-x2xv2" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.176782 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bt48m"] Mar 07 06:58:15 crc kubenswrapper[4941]: E0307 06:58:15.177607 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a8efd6-1008-4ba6-9db3-afb764026fc3" containerName="oc" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.177620 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a8efd6-1008-4ba6-9db3-afb764026fc3" containerName="oc" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.177718 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a8efd6-1008-4ba6-9db3-afb764026fc3" containerName="oc" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.178468 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.182557 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.187912 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bt48m"] Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.225435 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p45k\" (UniqueName: \"kubernetes.io/projected/503fd081-26d9-469a-a201-aac508651d3e-kube-api-access-7p45k\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.225510 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/503fd081-26d9-469a-a201-aac508651d3e-catalog-content\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.225540 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503fd081-26d9-469a-a201-aac508651d3e-utilities\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.326679 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503fd081-26d9-469a-a201-aac508651d3e-catalog-content\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.326745 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503fd081-26d9-469a-a201-aac508651d3e-utilities\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.326800 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p45k\" (UniqueName: \"kubernetes.io/projected/503fd081-26d9-469a-a201-aac508651d3e-kube-api-access-7p45k\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.327586 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/503fd081-26d9-469a-a201-aac508651d3e-utilities\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.327837 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503fd081-26d9-469a-a201-aac508651d3e-catalog-content\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.352359 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p45k\" (UniqueName: \"kubernetes.io/projected/503fd081-26d9-469a-a201-aac508651d3e-kube-api-access-7p45k\") pod \"redhat-operators-bt48m\" (UID: \"503fd081-26d9-469a-a201-aac508651d3e\") " pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.370059 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v6vdl"] Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.371030 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.373552 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.421935 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6vdl"] Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.428780 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwl9c\" (UniqueName: \"kubernetes.io/projected/94a21068-6164-47cd-ae7b-ec85cbed1247-kube-api-access-bwl9c\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.428845 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a21068-6164-47cd-ae7b-ec85cbed1247-utilities\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.428866 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a21068-6164-47cd-ae7b-ec85cbed1247-catalog-content\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.497323 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.530504 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwl9c\" (UniqueName: \"kubernetes.io/projected/94a21068-6164-47cd-ae7b-ec85cbed1247-kube-api-access-bwl9c\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.530652 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a21068-6164-47cd-ae7b-ec85cbed1247-utilities\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.530701 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a21068-6164-47cd-ae7b-ec85cbed1247-catalog-content\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.531284 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a21068-6164-47cd-ae7b-ec85cbed1247-catalog-content\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.531742 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a21068-6164-47cd-ae7b-ec85cbed1247-utilities\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " 
pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.552665 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwl9c\" (UniqueName: \"kubernetes.io/projected/94a21068-6164-47cd-ae7b-ec85cbed1247-kube-api-access-bwl9c\") pod \"certified-operators-v6vdl\" (UID: \"94a21068-6164-47cd-ae7b-ec85cbed1247\") " pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.694778 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:15 crc kubenswrapper[4941]: I0307 06:58:15.931209 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bt48m"] Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.067637 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wn7cj"] Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.068847 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.082775 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wn7cj"] Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.112572 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6vdl"] Mar 07 06:58:16 crc kubenswrapper[4941]: W0307 06:58:16.112953 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a21068_6164_47cd_ae7b_ec85cbed1247.slice/crio-96aacfb17b20c5849c521ac9431024c934988f7fb0d97843c49e535f203c09fb WatchSource:0}: Error finding container 96aacfb17b20c5849c521ac9431024c934988f7fb0d97843c49e535f203c09fb: Status 404 returned error can't find the container with id 96aacfb17b20c5849c521ac9431024c934988f7fb0d97843c49e535f203c09fb Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.138359 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abcf7611-6ded-4311-bcae-c7770dec27c4-trusted-ca\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.138430 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abcf7611-6ded-4311-bcae-c7770dec27c4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.138462 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abcf7611-6ded-4311-bcae-c7770dec27c4-registry-certificates\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.138491 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-bound-sa-token\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.138511 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72fw\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-kube-api-access-n72fw\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.138546 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.138576 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-registry-tls\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.138605 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abcf7611-6ded-4311-bcae-c7770dec27c4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.168679 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.240011 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abcf7611-6ded-4311-bcae-c7770dec27c4-trusted-ca\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.240061 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abcf7611-6ded-4311-bcae-c7770dec27c4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.240088 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/abcf7611-6ded-4311-bcae-c7770dec27c4-registry-certificates\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.240113 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-bound-sa-token\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.240150 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72fw\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-kube-api-access-n72fw\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.240179 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-registry-tls\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.240201 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abcf7611-6ded-4311-bcae-c7770dec27c4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.242131 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abcf7611-6ded-4311-bcae-c7770dec27c4-trusted-ca\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.242146 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/abcf7611-6ded-4311-bcae-c7770dec27c4-registry-certificates\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.243377 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/abcf7611-6ded-4311-bcae-c7770dec27c4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.247314 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/abcf7611-6ded-4311-bcae-c7770dec27c4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.247807 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-registry-tls\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc 
kubenswrapper[4941]: I0307 06:58:16.256562 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-bound-sa-token\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.259538 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72fw\" (UniqueName: \"kubernetes.io/projected/abcf7611-6ded-4311-bcae-c7770dec27c4-kube-api-access-n72fw\") pod \"image-registry-66df7c8f76-wn7cj\" (UID: \"abcf7611-6ded-4311-bcae-c7770dec27c4\") " pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.385690 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.675037 4941 generic.go:334] "Generic (PLEG): container finished" podID="94a21068-6164-47cd-ae7b-ec85cbed1247" containerID="49688847618eb0def49f07151a211667d0f5029b6826d5a52735d000605de13e" exitCode=0 Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.675123 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vdl" event={"ID":"94a21068-6164-47cd-ae7b-ec85cbed1247","Type":"ContainerDied","Data":"49688847618eb0def49f07151a211667d0f5029b6826d5a52735d000605de13e"} Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.675516 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vdl" event={"ID":"94a21068-6164-47cd-ae7b-ec85cbed1247","Type":"ContainerStarted","Data":"96aacfb17b20c5849c521ac9431024c934988f7fb0d97843c49e535f203c09fb"} Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.676688 4941 generic.go:334] "Generic 
(PLEG): container finished" podID="503fd081-26d9-469a-a201-aac508651d3e" containerID="f69295188cf2f611cc84f24f8d81bbdb9dfe1bae8b7802e09f29e65e80944f56" exitCode=0 Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.676734 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt48m" event={"ID":"503fd081-26d9-469a-a201-aac508651d3e","Type":"ContainerDied","Data":"f69295188cf2f611cc84f24f8d81bbdb9dfe1bae8b7802e09f29e65e80944f56"} Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.676764 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt48m" event={"ID":"503fd081-26d9-469a-a201-aac508651d3e","Type":"ContainerStarted","Data":"64b9c5012362b6a3eff81bb7405d1e5423f1ec15e9edecfefd574d8dd3ebc345"} Mar 07 06:58:16 crc kubenswrapper[4941]: I0307 06:58:16.789454 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wn7cj"] Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.568391 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nst75"] Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.569832 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.572433 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.590867 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nst75"] Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.659455 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-utilities\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.659622 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-catalog-content\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.659670 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbxv4\" (UniqueName: \"kubernetes.io/projected/1f666a7e-4fe4-44a4-8faa-25436dfd3753-kube-api-access-gbxv4\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.683359 4941 generic.go:334] "Generic (PLEG): container finished" podID="94a21068-6164-47cd-ae7b-ec85cbed1247" containerID="53e3498eed224ff31aea253325994f19716ce87b228e2d2db2c14bbb501b946a" exitCode=0 Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 
06:58:17.683440 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vdl" event={"ID":"94a21068-6164-47cd-ae7b-ec85cbed1247","Type":"ContainerDied","Data":"53e3498eed224ff31aea253325994f19716ce87b228e2d2db2c14bbb501b946a"} Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.685050 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" event={"ID":"abcf7611-6ded-4311-bcae-c7770dec27c4","Type":"ContainerStarted","Data":"a28ec5d52948c7aa75fa037ae45a6305c468f0f2ffddcd1bb6f40572abd25e51"} Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.685124 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" event={"ID":"abcf7611-6ded-4311-bcae-c7770dec27c4","Type":"ContainerStarted","Data":"e0d0e507fceb22109ed4c0a026fae6e48e5b5dea43d75b68a68a68a996c1e102"} Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.685719 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.720342 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" podStartSLOduration=1.7203215649999999 podStartE2EDuration="1.720321565s" podCreationTimestamp="2026-03-07 06:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 06:58:17.718946489 +0000 UTC m=+394.671311954" watchObservedRunningTime="2026-03-07 06:58:17.720321565 +0000 UTC m=+394.672687060" Mar 07 06:58:17 crc kubenswrapper[4941]: E0307 06:58:17.731453 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a21068_6164_47cd_ae7b_ec85cbed1247.slice/crio-53e3498eed224ff31aea253325994f19716ce87b228e2d2db2c14bbb501b946a.scope\": RecentStats: unable to find data in memory cache]" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.760684 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-catalog-content\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.761303 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbxv4\" (UniqueName: \"kubernetes.io/projected/1f666a7e-4fe4-44a4-8faa-25436dfd3753-kube-api-access-gbxv4\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.761473 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-utilities\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.761595 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-catalog-content\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.761958 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-utilities\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.769041 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mghxn"] Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.770570 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.775592 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.782220 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mghxn"] Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.792816 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbxv4\" (UniqueName: \"kubernetes.io/projected/1f666a7e-4fe4-44a4-8faa-25436dfd3753-kube-api-access-gbxv4\") pod \"community-operators-nst75\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.885585 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.965209 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-568zc\" (UniqueName: \"kubernetes.io/projected/1f4849d1-1617-465a-958a-1b31bb5de2df-kube-api-access-568zc\") pod \"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.965288 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f4849d1-1617-465a-958a-1b31bb5de2df-utilities\") pod \"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:17 crc kubenswrapper[4941]: I0307 06:58:17.965325 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f4849d1-1617-465a-958a-1b31bb5de2df-catalog-content\") pod \"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.066462 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-568zc\" (UniqueName: \"kubernetes.io/projected/1f4849d1-1617-465a-958a-1b31bb5de2df-kube-api-access-568zc\") pod \"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.066975 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f4849d1-1617-465a-958a-1b31bb5de2df-utilities\") pod 
\"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.067018 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f4849d1-1617-465a-958a-1b31bb5de2df-catalog-content\") pod \"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.067522 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f4849d1-1617-465a-958a-1b31bb5de2df-utilities\") pod \"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.067612 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f4849d1-1617-465a-958a-1b31bb5de2df-catalog-content\") pod \"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.069773 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nst75"] Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.083770 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-568zc\" (UniqueName: \"kubernetes.io/projected/1f4849d1-1617-465a-958a-1b31bb5de2df-kube-api-access-568zc\") pod \"redhat-marketplace-mghxn\" (UID: \"1f4849d1-1617-465a-958a-1b31bb5de2df\") " pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:18 crc kubenswrapper[4941]: W0307 06:58:18.084099 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f666a7e_4fe4_44a4_8faa_25436dfd3753.slice/crio-d8e8b972218ba1f5285fc751bb147ca789f22db3493987877f934ac910fcb33e WatchSource:0}: Error finding container d8e8b972218ba1f5285fc751bb147ca789f22db3493987877f934ac910fcb33e: Status 404 returned error can't find the container with id d8e8b972218ba1f5285fc751bb147ca789f22db3493987877f934ac910fcb33e Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.124796 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.519765 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mghxn"] Mar 07 06:58:18 crc kubenswrapper[4941]: W0307 06:58:18.528035 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f4849d1_1617_465a_958a_1b31bb5de2df.slice/crio-84710cbe3af5741a827e28bde83e531288d46f83b6159e2fdbfb0f108946d6bf WatchSource:0}: Error finding container 84710cbe3af5741a827e28bde83e531288d46f83b6159e2fdbfb0f108946d6bf: Status 404 returned error can't find the container with id 84710cbe3af5741a827e28bde83e531288d46f83b6159e2fdbfb0f108946d6bf Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.691864 4941 generic.go:334] "Generic (PLEG): container finished" podID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerID="4124e3f5a055901b92a94ec2098c0f08d14635bfc57d16a43caae8e608189c7b" exitCode=0 Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.691955 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nst75" event={"ID":"1f666a7e-4fe4-44a4-8faa-25436dfd3753","Type":"ContainerDied","Data":"4124e3f5a055901b92a94ec2098c0f08d14635bfc57d16a43caae8e608189c7b"} Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.692503 4941 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-nst75" event={"ID":"1f666a7e-4fe4-44a4-8faa-25436dfd3753","Type":"ContainerStarted","Data":"d8e8b972218ba1f5285fc751bb147ca789f22db3493987877f934ac910fcb33e"} Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.696319 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt48m" event={"ID":"503fd081-26d9-469a-a201-aac508651d3e","Type":"ContainerStarted","Data":"06dd2c40ad024c4e2b38005016d1f87f07da4e7ce1810665fe3a83221c9f7743"} Mar 07 06:58:18 crc kubenswrapper[4941]: I0307 06:58:18.697684 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mghxn" event={"ID":"1f4849d1-1617-465a-958a-1b31bb5de2df","Type":"ContainerStarted","Data":"84710cbe3af5741a827e28bde83e531288d46f83b6159e2fdbfb0f108946d6bf"} Mar 07 06:58:19 crc kubenswrapper[4941]: I0307 06:58:19.703930 4941 generic.go:334] "Generic (PLEG): container finished" podID="503fd081-26d9-469a-a201-aac508651d3e" containerID="06dd2c40ad024c4e2b38005016d1f87f07da4e7ce1810665fe3a83221c9f7743" exitCode=0 Mar 07 06:58:19 crc kubenswrapper[4941]: I0307 06:58:19.704046 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt48m" event={"ID":"503fd081-26d9-469a-a201-aac508651d3e","Type":"ContainerDied","Data":"06dd2c40ad024c4e2b38005016d1f87f07da4e7ce1810665fe3a83221c9f7743"} Mar 07 06:58:19 crc kubenswrapper[4941]: I0307 06:58:19.707835 4941 generic.go:334] "Generic (PLEG): container finished" podID="1f4849d1-1617-465a-958a-1b31bb5de2df" containerID="5f5826c3c0e800608961c0dbf73331a1884ae5e957ad969fe9a8421abc130ab6" exitCode=0 Mar 07 06:58:19 crc kubenswrapper[4941]: I0307 06:58:19.707923 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mghxn" 
event={"ID":"1f4849d1-1617-465a-958a-1b31bb5de2df","Type":"ContainerDied","Data":"5f5826c3c0e800608961c0dbf73331a1884ae5e957ad969fe9a8421abc130ab6"} Mar 07 06:58:20 crc kubenswrapper[4941]: I0307 06:58:20.718206 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vdl" event={"ID":"94a21068-6164-47cd-ae7b-ec85cbed1247","Type":"ContainerStarted","Data":"62da5bea978d639f35b6b6382c6564742231bb9bee0bed65a7e60d57030fb2a7"} Mar 07 06:58:21 crc kubenswrapper[4941]: I0307 06:58:21.066036 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v6vdl" podStartSLOduration=3.244654117 podStartE2EDuration="6.066003245s" podCreationTimestamp="2026-03-07 06:58:15 +0000 UTC" firstStartedPulling="2026-03-07 06:58:16.679005555 +0000 UTC m=+393.631371020" lastFinishedPulling="2026-03-07 06:58:19.500354683 +0000 UTC m=+396.452720148" observedRunningTime="2026-03-07 06:58:21.058161338 +0000 UTC m=+398.010526803" watchObservedRunningTime="2026-03-07 06:58:21.066003245 +0000 UTC m=+398.018368760" Mar 07 06:58:21 crc kubenswrapper[4941]: I0307 06:58:21.732224 4941 generic.go:334] "Generic (PLEG): container finished" podID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerID="cfbe5382a505fa5b203a29534fbca11abf2f5bcfeae42f0efb4f0625da36ca7d" exitCode=0 Mar 07 06:58:21 crc kubenswrapper[4941]: I0307 06:58:21.732634 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nst75" event={"ID":"1f666a7e-4fe4-44a4-8faa-25436dfd3753","Type":"ContainerDied","Data":"cfbe5382a505fa5b203a29534fbca11abf2f5bcfeae42f0efb4f0625da36ca7d"} Mar 07 06:58:21 crc kubenswrapper[4941]: I0307 06:58:21.747627 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt48m" 
event={"ID":"503fd081-26d9-469a-a201-aac508651d3e","Type":"ContainerStarted","Data":"691b1ee2e6f9e31bb7dd8c96de64d6c7146a7005d684220c6063974007985284"} Mar 07 06:58:21 crc kubenswrapper[4941]: I0307 06:58:21.773451 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bt48m" podStartSLOduration=2.061343886 podStartE2EDuration="6.773432235s" podCreationTimestamp="2026-03-07 06:58:15 +0000 UTC" firstStartedPulling="2026-03-07 06:58:16.679385875 +0000 UTC m=+393.631751340" lastFinishedPulling="2026-03-07 06:58:21.391474224 +0000 UTC m=+398.343839689" observedRunningTime="2026-03-07 06:58:21.767638552 +0000 UTC m=+398.720004017" watchObservedRunningTime="2026-03-07 06:58:21.773432235 +0000 UTC m=+398.725797700" Mar 07 06:58:22 crc kubenswrapper[4941]: I0307 06:58:22.754854 4941 generic.go:334] "Generic (PLEG): container finished" podID="1f4849d1-1617-465a-958a-1b31bb5de2df" containerID="00bac6148b5d0ddb5f42091e1c4cb3355b9dc407524d7004a4e376e9327a6c3c" exitCode=0 Mar 07 06:58:22 crc kubenswrapper[4941]: I0307 06:58:22.754953 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mghxn" event={"ID":"1f4849d1-1617-465a-958a-1b31bb5de2df","Type":"ContainerDied","Data":"00bac6148b5d0ddb5f42091e1c4cb3355b9dc407524d7004a4e376e9327a6c3c"} Mar 07 06:58:22 crc kubenswrapper[4941]: I0307 06:58:22.763000 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nst75" event={"ID":"1f666a7e-4fe4-44a4-8faa-25436dfd3753","Type":"ContainerStarted","Data":"fc3a9c4579c2b26e078c9019735b718c51eb7637ba1f6a93ffc045b5ebcbd4ac"} Mar 07 06:58:22 crc kubenswrapper[4941]: I0307 06:58:22.795696 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nst75" podStartSLOduration=2.373523872 podStartE2EDuration="5.795674302s" podCreationTimestamp="2026-03-07 06:58:17 +0000 UTC" 
firstStartedPulling="2026-03-07 06:58:18.693429894 +0000 UTC m=+395.645795359" lastFinishedPulling="2026-03-07 06:58:22.115580314 +0000 UTC m=+399.067945789" observedRunningTime="2026-03-07 06:58:22.794676666 +0000 UTC m=+399.747042131" watchObservedRunningTime="2026-03-07 06:58:22.795674302 +0000 UTC m=+399.748039767" Mar 07 06:58:24 crc kubenswrapper[4941]: I0307 06:58:24.774869 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mghxn" event={"ID":"1f4849d1-1617-465a-958a-1b31bb5de2df","Type":"ContainerStarted","Data":"cf9f932e94e07e1d02460d82a52f8bf4d7739122e9d6044bd2932ab3464a80f1"} Mar 07 06:58:25 crc kubenswrapper[4941]: I0307 06:58:25.497743 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:25 crc kubenswrapper[4941]: I0307 06:58:25.498235 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:25 crc kubenswrapper[4941]: I0307 06:58:25.695055 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:25 crc kubenswrapper[4941]: I0307 06:58:25.695555 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:25 crc kubenswrapper[4941]: I0307 06:58:25.742280 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:25 crc kubenswrapper[4941]: I0307 06:58:25.761413 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mghxn" podStartSLOduration=4.246019061 podStartE2EDuration="8.761383613s" podCreationTimestamp="2026-03-07 06:58:17 +0000 UTC" firstStartedPulling="2026-03-07 06:58:19.709609831 +0000 UTC m=+396.661975296" 
lastFinishedPulling="2026-03-07 06:58:24.224974383 +0000 UTC m=+401.177339848" observedRunningTime="2026-03-07 06:58:24.797103898 +0000 UTC m=+401.749469363" watchObservedRunningTime="2026-03-07 06:58:25.761383613 +0000 UTC m=+402.713749078" Mar 07 06:58:25 crc kubenswrapper[4941]: I0307 06:58:25.816587 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v6vdl" Mar 07 06:58:26 crc kubenswrapper[4941]: I0307 06:58:26.539845 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bt48m" podUID="503fd081-26d9-469a-a201-aac508651d3e" containerName="registry-server" probeResult="failure" output=< Mar 07 06:58:26 crc kubenswrapper[4941]: timeout: failed to connect service ":50051" within 1s Mar 07 06:58:26 crc kubenswrapper[4941]: > Mar 07 06:58:27 crc kubenswrapper[4941]: I0307 06:58:27.887053 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:27 crc kubenswrapper[4941]: I0307 06:58:27.887105 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:27 crc kubenswrapper[4941]: I0307 06:58:27.950130 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:28 crc kubenswrapper[4941]: I0307 06:58:28.125053 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:28 crc kubenswrapper[4941]: I0307 06:58:28.125738 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:28 crc kubenswrapper[4941]: I0307 06:58:28.174930 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 
06:58:28 crc kubenswrapper[4941]: I0307 06:58:28.851419 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nst75" Mar 07 06:58:29 crc kubenswrapper[4941]: I0307 06:58:29.847051 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mghxn" Mar 07 06:58:35 crc kubenswrapper[4941]: I0307 06:58:35.548769 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:35 crc kubenswrapper[4941]: I0307 06:58:35.602107 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bt48m" Mar 07 06:58:36 crc kubenswrapper[4941]: I0307 06:58:36.392767 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wn7cj" Mar 07 06:58:36 crc kubenswrapper[4941]: I0307 06:58:36.446082 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wpb7c"] Mar 07 06:58:40 crc kubenswrapper[4941]: I0307 06:58:40.314891 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 06:58:40 crc kubenswrapper[4941]: I0307 06:58:40.315509 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 06:59:01 crc kubenswrapper[4941]: I0307 06:59:01.494978 4941 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" podUID="0cf41cff-9af9-423f-8e57-117983f90b7b" containerName="registry" containerID="cri-o://9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15" gracePeriod=30 Mar 07 06:59:01 crc kubenswrapper[4941]: I0307 06:59:01.906007 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.008288 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0cf41cff-9af9-423f-8e57-117983f90b7b-installation-pull-secrets\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.008381 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-bound-sa-token\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.008454 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnml\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-kube-api-access-rvnml\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.008519 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0cf41cff-9af9-423f-8e57-117983f90b7b-ca-trust-extracted\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.008716 4941 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.008769 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-trusted-ca\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.008842 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-tls\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.008889 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-certificates\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.010356 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.010602 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.015334 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf41cff-9af9-423f-8e57-117983f90b7b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.016620 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-kube-api-access-rvnml" (OuterVolumeSpecName: "kube-api-access-rvnml") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b"). InnerVolumeSpecName "kube-api-access-rvnml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.018314 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.020030 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 06:59:02 crc kubenswrapper[4941]: E0307 06:59:02.020957 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:0cf41cff-9af9-423f-8e57-117983f90b7b nodeName:}" failed. No retries permitted until 2026-03-07 06:59:02.520934251 +0000 UTC m=+439.473299716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "registry-storage" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.023903 4941 generic.go:334] "Generic (PLEG): container finished" podID="0cf41cff-9af9-423f-8e57-117983f90b7b" containerID="9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15" exitCode=0 Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.024113 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.023978 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" event={"ID":"0cf41cff-9af9-423f-8e57-117983f90b7b","Type":"ContainerDied","Data":"9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15"} Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.024268 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wpb7c" event={"ID":"0cf41cff-9af9-423f-8e57-117983f90b7b","Type":"ContainerDied","Data":"e76fda133319d6f4fd1adb769f781001de73e7fc3eb6e6a28a9c64bf8c785c65"} Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.024293 4941 scope.go:117] "RemoveContainer" containerID="9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.028862 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cf41cff-9af9-423f-8e57-117983f90b7b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.084015 4941 scope.go:117] "RemoveContainer" containerID="9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15" Mar 07 06:59:02 crc kubenswrapper[4941]: E0307 06:59:02.084776 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15\": container with ID starting with 9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15 not found: ID does not exist" containerID="9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.084831 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15"} err="failed to get container status \"9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15\": rpc error: code = NotFound desc = could not find container \"9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15\": container with ID starting with 9186eac316e1b486a99840556c045eca05511c648142aab7f63e6c4c5893bf15 not found: ID does not exist" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.110158 4941 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.110198 4941 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.110214 4941 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/0cf41cff-9af9-423f-8e57-117983f90b7b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.110225 4941 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.110237 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnml\" (UniqueName: \"kubernetes.io/projected/0cf41cff-9af9-423f-8e57-117983f90b7b-kube-api-access-rvnml\") on node \"crc\" DevicePath \"\"" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.110248 4941 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0cf41cff-9af9-423f-8e57-117983f90b7b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.110259 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cf41cff-9af9-423f-8e57-117983f90b7b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.618470 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0cf41cff-9af9-423f-8e57-117983f90b7b\" (UID: \"0cf41cff-9af9-423f-8e57-117983f90b7b\") " Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.637392 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0cf41cff-9af9-423f-8e57-117983f90b7b" (UID: "0cf41cff-9af9-423f-8e57-117983f90b7b"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.774120 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wpb7c"] Mar 07 06:59:02 crc kubenswrapper[4941]: I0307 06:59:02.782180 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wpb7c"] Mar 07 06:59:03 crc kubenswrapper[4941]: I0307 06:59:03.963222 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf41cff-9af9-423f-8e57-117983f90b7b" path="/var/lib/kubelet/pods/0cf41cff-9af9-423f-8e57-117983f90b7b/volumes" Mar 07 06:59:10 crc kubenswrapper[4941]: I0307 06:59:10.313759 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 06:59:10 crc kubenswrapper[4941]: I0307 06:59:10.314172 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 06:59:40 crc kubenswrapper[4941]: I0307 06:59:40.314510 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 06:59:40 crc kubenswrapper[4941]: I0307 06:59:40.315148 4941 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 06:59:40 crc kubenswrapper[4941]: I0307 06:59:40.315210 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 06:59:40 crc kubenswrapper[4941]: I0307 06:59:40.315936 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7416552df2e4055270a338bb841d809844616b740ce91a60c1b321d6a4dac058"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 06:59:40 crc kubenswrapper[4941]: I0307 06:59:40.316008 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://7416552df2e4055270a338bb841d809844616b740ce91a60c1b321d6a4dac058" gracePeriod=600 Mar 07 06:59:41 crc kubenswrapper[4941]: I0307 06:59:41.304717 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="7416552df2e4055270a338bb841d809844616b740ce91a60c1b321d6a4dac058" exitCode=0 Mar 07 06:59:41 crc kubenswrapper[4941]: I0307 06:59:41.304824 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"7416552df2e4055270a338bb841d809844616b740ce91a60c1b321d6a4dac058"} Mar 07 06:59:41 crc kubenswrapper[4941]: I0307 06:59:41.305317 4941 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"d746908cc3303b1fdbe53fad12169c53fe203d0fd3b7bd7e783a1191a81869a4"} Mar 07 06:59:41 crc kubenswrapper[4941]: I0307 06:59:41.305351 4941 scope.go:117] "RemoveContainer" containerID="fff30659866101ada5a60d6fb392c2f0f4c883449f13f36b6e7f3d0b46b2f47a" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.140951 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547780-42t8n"] Mar 07 07:00:00 crc kubenswrapper[4941]: E0307 07:00:00.142106 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf41cff-9af9-423f-8e57-117983f90b7b" containerName="registry" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.142124 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf41cff-9af9-423f-8e57-117983f90b7b" containerName="registry" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.142255 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf41cff-9af9-423f-8e57-117983f90b7b" containerName="registry" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.142807 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547780-42t8n" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.143825 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss"] Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.144509 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.148794 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.149341 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.150150 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.150186 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.150468 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.152905 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss"] Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.156865 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547780-42t8n"] Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.331324 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd63482d-26a4-4246-8dbc-bad3c19aff6a-config-volume\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.331393 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd63482d-26a4-4246-8dbc-bad3c19aff6a-secret-volume\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.331457 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk5ww\" (UniqueName: \"kubernetes.io/projected/da69c66c-a6c7-439f-a328-6514f986d16b-kube-api-access-lk5ww\") pod \"auto-csr-approver-29547780-42t8n\" (UID: \"da69c66c-a6c7-439f-a328-6514f986d16b\") " pod="openshift-infra/auto-csr-approver-29547780-42t8n" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.331720 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj72j\" (UniqueName: \"kubernetes.io/projected/fd63482d-26a4-4246-8dbc-bad3c19aff6a-kube-api-access-gj72j\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.432765 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj72j\" (UniqueName: \"kubernetes.io/projected/fd63482d-26a4-4246-8dbc-bad3c19aff6a-kube-api-access-gj72j\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.432934 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd63482d-26a4-4246-8dbc-bad3c19aff6a-config-volume\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.433008 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd63482d-26a4-4246-8dbc-bad3c19aff6a-secret-volume\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.433103 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk5ww\" (UniqueName: \"kubernetes.io/projected/da69c66c-a6c7-439f-a328-6514f986d16b-kube-api-access-lk5ww\") pod \"auto-csr-approver-29547780-42t8n\" (UID: \"da69c66c-a6c7-439f-a328-6514f986d16b\") " pod="openshift-infra/auto-csr-approver-29547780-42t8n" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.433981 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd63482d-26a4-4246-8dbc-bad3c19aff6a-config-volume\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.441616 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd63482d-26a4-4246-8dbc-bad3c19aff6a-secret-volume\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.455932 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk5ww\" (UniqueName: \"kubernetes.io/projected/da69c66c-a6c7-439f-a328-6514f986d16b-kube-api-access-lk5ww\") pod 
\"auto-csr-approver-29547780-42t8n\" (UID: \"da69c66c-a6c7-439f-a328-6514f986d16b\") " pod="openshift-infra/auto-csr-approver-29547780-42t8n" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.459345 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj72j\" (UniqueName: \"kubernetes.io/projected/fd63482d-26a4-4246-8dbc-bad3c19aff6a-kube-api-access-gj72j\") pod \"collect-profiles-29547780-nlhss\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.491194 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547780-42t8n" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.512734 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.702526 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547780-42t8n"] Mar 07 07:00:00 crc kubenswrapper[4941]: I0307 07:00:00.764991 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss"] Mar 07 07:00:00 crc kubenswrapper[4941]: W0307 07:00:00.770159 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd63482d_26a4_4246_8dbc_bad3c19aff6a.slice/crio-f5b2458f76a49cdd0be781060a8ce2bbf4b2ac25dc362e672369cbf25b817efd WatchSource:0}: Error finding container f5b2458f76a49cdd0be781060a8ce2bbf4b2ac25dc362e672369cbf25b817efd: Status 404 returned error can't find the container with id f5b2458f76a49cdd0be781060a8ce2bbf4b2ac25dc362e672369cbf25b817efd Mar 07 07:00:01 crc kubenswrapper[4941]: I0307 07:00:01.444326 4941 generic.go:334] "Generic (PLEG): container 
finished" podID="fd63482d-26a4-4246-8dbc-bad3c19aff6a" containerID="d1544d2a6907322c8e6755a159ebb752f8b1c3cbea675954e6310f3b63370818" exitCode=0 Mar 07 07:00:01 crc kubenswrapper[4941]: I0307 07:00:01.444489 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" event={"ID":"fd63482d-26a4-4246-8dbc-bad3c19aff6a","Type":"ContainerDied","Data":"d1544d2a6907322c8e6755a159ebb752f8b1c3cbea675954e6310f3b63370818"} Mar 07 07:00:01 crc kubenswrapper[4941]: I0307 07:00:01.445009 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" event={"ID":"fd63482d-26a4-4246-8dbc-bad3c19aff6a","Type":"ContainerStarted","Data":"f5b2458f76a49cdd0be781060a8ce2bbf4b2ac25dc362e672369cbf25b817efd"} Mar 07 07:00:01 crc kubenswrapper[4941]: I0307 07:00:01.446742 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547780-42t8n" event={"ID":"da69c66c-a6c7-439f-a328-6514f986d16b","Type":"ContainerStarted","Data":"48f891f6efe4f3ba7e31a76cdb5b5b4acac87e53e6728d7d83ceecd1676bf5c7"} Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.766759 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.866171 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd63482d-26a4-4246-8dbc-bad3c19aff6a-config-volume\") pod \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.866331 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd63482d-26a4-4246-8dbc-bad3c19aff6a-secret-volume\") pod \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.866440 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj72j\" (UniqueName: \"kubernetes.io/projected/fd63482d-26a4-4246-8dbc-bad3c19aff6a-kube-api-access-gj72j\") pod \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\" (UID: \"fd63482d-26a4-4246-8dbc-bad3c19aff6a\") " Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.867556 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd63482d-26a4-4246-8dbc-bad3c19aff6a-config-volume" (OuterVolumeSpecName: "config-volume") pod "fd63482d-26a4-4246-8dbc-bad3c19aff6a" (UID: "fd63482d-26a4-4246-8dbc-bad3c19aff6a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.875616 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd63482d-26a4-4246-8dbc-bad3c19aff6a-kube-api-access-gj72j" (OuterVolumeSpecName: "kube-api-access-gj72j") pod "fd63482d-26a4-4246-8dbc-bad3c19aff6a" (UID: "fd63482d-26a4-4246-8dbc-bad3c19aff6a"). 
InnerVolumeSpecName "kube-api-access-gj72j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.875764 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd63482d-26a4-4246-8dbc-bad3c19aff6a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fd63482d-26a4-4246-8dbc-bad3c19aff6a" (UID: "fd63482d-26a4-4246-8dbc-bad3c19aff6a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.968536 4941 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd63482d-26a4-4246-8dbc-bad3c19aff6a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.968602 4941 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd63482d-26a4-4246-8dbc-bad3c19aff6a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:00:02 crc kubenswrapper[4941]: I0307 07:00:02.968623 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj72j\" (UniqueName: \"kubernetes.io/projected/fd63482d-26a4-4246-8dbc-bad3c19aff6a-kube-api-access-gj72j\") on node \"crc\" DevicePath \"\"" Mar 07 07:00:03 crc kubenswrapper[4941]: I0307 07:00:03.462627 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" event={"ID":"fd63482d-26a4-4246-8dbc-bad3c19aff6a","Type":"ContainerDied","Data":"f5b2458f76a49cdd0be781060a8ce2bbf4b2ac25dc362e672369cbf25b817efd"} Mar 07 07:00:03 crc kubenswrapper[4941]: I0307 07:00:03.462668 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b2458f76a49cdd0be781060a8ce2bbf4b2ac25dc362e672369cbf25b817efd" Mar 07 07:00:03 crc kubenswrapper[4941]: I0307 07:00:03.462730 4941 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss" Mar 07 07:00:14 crc kubenswrapper[4941]: I0307 07:00:14.539889 4941 generic.go:334] "Generic (PLEG): container finished" podID="da69c66c-a6c7-439f-a328-6514f986d16b" containerID="d3ee4a945633e6b0a3254728184548cc71eea0e37bf7eb74506c8b423cbb8379" exitCode=0 Mar 07 07:00:14 crc kubenswrapper[4941]: I0307 07:00:14.540000 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547780-42t8n" event={"ID":"da69c66c-a6c7-439f-a328-6514f986d16b","Type":"ContainerDied","Data":"d3ee4a945633e6b0a3254728184548cc71eea0e37bf7eb74506c8b423cbb8379"} Mar 07 07:00:15 crc kubenswrapper[4941]: I0307 07:00:15.796094 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547780-42t8n" Mar 07 07:00:15 crc kubenswrapper[4941]: I0307 07:00:15.954404 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk5ww\" (UniqueName: \"kubernetes.io/projected/da69c66c-a6c7-439f-a328-6514f986d16b-kube-api-access-lk5ww\") pod \"da69c66c-a6c7-439f-a328-6514f986d16b\" (UID: \"da69c66c-a6c7-439f-a328-6514f986d16b\") " Mar 07 07:00:15 crc kubenswrapper[4941]: I0307 07:00:15.962903 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da69c66c-a6c7-439f-a328-6514f986d16b-kube-api-access-lk5ww" (OuterVolumeSpecName: "kube-api-access-lk5ww") pod "da69c66c-a6c7-439f-a328-6514f986d16b" (UID: "da69c66c-a6c7-439f-a328-6514f986d16b"). InnerVolumeSpecName "kube-api-access-lk5ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:00:16 crc kubenswrapper[4941]: I0307 07:00:16.056017 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk5ww\" (UniqueName: \"kubernetes.io/projected/da69c66c-a6c7-439f-a328-6514f986d16b-kube-api-access-lk5ww\") on node \"crc\" DevicePath \"\"" Mar 07 07:00:16 crc kubenswrapper[4941]: I0307 07:00:16.559068 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547780-42t8n" event={"ID":"da69c66c-a6c7-439f-a328-6514f986d16b","Type":"ContainerDied","Data":"48f891f6efe4f3ba7e31a76cdb5b5b4acac87e53e6728d7d83ceecd1676bf5c7"} Mar 07 07:00:16 crc kubenswrapper[4941]: I0307 07:00:16.559120 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48f891f6efe4f3ba7e31a76cdb5b5b4acac87e53e6728d7d83ceecd1676bf5c7" Mar 07 07:00:16 crc kubenswrapper[4941]: I0307 07:00:16.559183 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547780-42t8n" Mar 07 07:00:16 crc kubenswrapper[4941]: I0307 07:00:16.863725 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547774-2b2fh"] Mar 07 07:00:16 crc kubenswrapper[4941]: I0307 07:00:16.868568 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547774-2b2fh"] Mar 07 07:00:17 crc kubenswrapper[4941]: I0307 07:00:17.960481 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7566a1ff-1f6b-4e99-903b-ff036f98c411" path="/var/lib/kubelet/pods/7566a1ff-1f6b-4e99-903b-ff036f98c411/volumes" Mar 07 07:01:40 crc kubenswrapper[4941]: I0307 07:01:40.314099 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 07 07:01:40 crc kubenswrapper[4941]: I0307 07:01:40.316717 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.148657 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547782-kqxht"] Mar 07 07:02:00 crc kubenswrapper[4941]: E0307 07:02:00.150142 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd63482d-26a4-4246-8dbc-bad3c19aff6a" containerName="collect-profiles" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.150172 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd63482d-26a4-4246-8dbc-bad3c19aff6a" containerName="collect-profiles" Mar 07 07:02:00 crc kubenswrapper[4941]: E0307 07:02:00.150217 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da69c66c-a6c7-439f-a328-6514f986d16b" containerName="oc" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.150231 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="da69c66c-a6c7-439f-a328-6514f986d16b" containerName="oc" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.150469 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd63482d-26a4-4246-8dbc-bad3c19aff6a" containerName="collect-profiles" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.150506 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="da69c66c-a6c7-439f-a328-6514f986d16b" containerName="oc" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.151166 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547782-kqxht" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.153532 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547782-kqxht"] Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.154989 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.156865 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.157334 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.335135 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzp4c\" (UniqueName: \"kubernetes.io/projected/8ce6a206-b2ea-41bd-84f7-c7f1007b321c-kube-api-access-pzp4c\") pod \"auto-csr-approver-29547782-kqxht\" (UID: \"8ce6a206-b2ea-41bd-84f7-c7f1007b321c\") " pod="openshift-infra/auto-csr-approver-29547782-kqxht" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.436364 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzp4c\" (UniqueName: \"kubernetes.io/projected/8ce6a206-b2ea-41bd-84f7-c7f1007b321c-kube-api-access-pzp4c\") pod \"auto-csr-approver-29547782-kqxht\" (UID: \"8ce6a206-b2ea-41bd-84f7-c7f1007b321c\") " pod="openshift-infra/auto-csr-approver-29547782-kqxht" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.462123 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzp4c\" (UniqueName: \"kubernetes.io/projected/8ce6a206-b2ea-41bd-84f7-c7f1007b321c-kube-api-access-pzp4c\") pod \"auto-csr-approver-29547782-kqxht\" (UID: \"8ce6a206-b2ea-41bd-84f7-c7f1007b321c\") " 
pod="openshift-infra/auto-csr-approver-29547782-kqxht" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.482514 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547782-kqxht" Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.730794 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547782-kqxht"] Mar 07 07:02:00 crc kubenswrapper[4941]: I0307 07:02:00.745079 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:02:01 crc kubenswrapper[4941]: I0307 07:02:01.675624 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547782-kqxht" event={"ID":"8ce6a206-b2ea-41bd-84f7-c7f1007b321c","Type":"ContainerStarted","Data":"4c0c5aab9ec650c31e2b49ad569ba686f35c45654287b1ba21a2c2669dd44204"} Mar 07 07:02:02 crc kubenswrapper[4941]: I0307 07:02:02.172154 4941 scope.go:117] "RemoveContainer" containerID="9c04ce9ef1572b01e35c46419ecc2059217c92a09940fe2cae1a259cb41f8ccc" Mar 07 07:02:02 crc kubenswrapper[4941]: I0307 07:02:02.686527 4941 generic.go:334] "Generic (PLEG): container finished" podID="8ce6a206-b2ea-41bd-84f7-c7f1007b321c" containerID="fec4a1c5879c5bc7d04268a2f315e08c10492170ef27522b5f50cb4ab4b9a1dd" exitCode=0 Mar 07 07:02:02 crc kubenswrapper[4941]: I0307 07:02:02.686573 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547782-kqxht" event={"ID":"8ce6a206-b2ea-41bd-84f7-c7f1007b321c","Type":"ContainerDied","Data":"fec4a1c5879c5bc7d04268a2f315e08c10492170ef27522b5f50cb4ab4b9a1dd"} Mar 07 07:02:03 crc kubenswrapper[4941]: I0307 07:02:03.945596 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547782-kqxht" Mar 07 07:02:03 crc kubenswrapper[4941]: I0307 07:02:03.986492 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzp4c\" (UniqueName: \"kubernetes.io/projected/8ce6a206-b2ea-41bd-84f7-c7f1007b321c-kube-api-access-pzp4c\") pod \"8ce6a206-b2ea-41bd-84f7-c7f1007b321c\" (UID: \"8ce6a206-b2ea-41bd-84f7-c7f1007b321c\") " Mar 07 07:02:03 crc kubenswrapper[4941]: I0307 07:02:03.997872 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce6a206-b2ea-41bd-84f7-c7f1007b321c-kube-api-access-pzp4c" (OuterVolumeSpecName: "kube-api-access-pzp4c") pod "8ce6a206-b2ea-41bd-84f7-c7f1007b321c" (UID: "8ce6a206-b2ea-41bd-84f7-c7f1007b321c"). InnerVolumeSpecName "kube-api-access-pzp4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:02:04 crc kubenswrapper[4941]: I0307 07:02:04.088822 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzp4c\" (UniqueName: \"kubernetes.io/projected/8ce6a206-b2ea-41bd-84f7-c7f1007b321c-kube-api-access-pzp4c\") on node \"crc\" DevicePath \"\"" Mar 07 07:02:04 crc kubenswrapper[4941]: I0307 07:02:04.703571 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547782-kqxht" event={"ID":"8ce6a206-b2ea-41bd-84f7-c7f1007b321c","Type":"ContainerDied","Data":"4c0c5aab9ec650c31e2b49ad569ba686f35c45654287b1ba21a2c2669dd44204"} Mar 07 07:02:04 crc kubenswrapper[4941]: I0307 07:02:04.703629 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c0c5aab9ec650c31e2b49ad569ba686f35c45654287b1ba21a2c2669dd44204" Mar 07 07:02:04 crc kubenswrapper[4941]: I0307 07:02:04.703630 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547782-kqxht" Mar 07 07:02:05 crc kubenswrapper[4941]: I0307 07:02:05.012189 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547776-g4ksl"] Mar 07 07:02:05 crc kubenswrapper[4941]: I0307 07:02:05.015389 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547776-g4ksl"] Mar 07 07:02:05 crc kubenswrapper[4941]: I0307 07:02:05.962734 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3fa14c7-e4f9-42fc-8972-8d18263ee801" path="/var/lib/kubelet/pods/e3fa14c7-e4f9-42fc-8972-8d18263ee801/volumes" Mar 07 07:02:10 crc kubenswrapper[4941]: I0307 07:02:10.313961 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:02:10 crc kubenswrapper[4941]: I0307 07:02:10.314461 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.314496 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.315593 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" 
podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.315654 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.316395 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d746908cc3303b1fdbe53fad12169c53fe203d0fd3b7bd7e783a1191a81869a4"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.316508 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://d746908cc3303b1fdbe53fad12169c53fe203d0fd3b7bd7e783a1191a81869a4" gracePeriod=600 Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.949136 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="d746908cc3303b1fdbe53fad12169c53fe203d0fd3b7bd7e783a1191a81869a4" exitCode=0 Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.949198 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"d746908cc3303b1fdbe53fad12169c53fe203d0fd3b7bd7e783a1191a81869a4"} Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.949611 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"b7fe83ca68d83b7b6bf2fa18e88a311d0a293429f704eef511a481cd353a9e5a"} Mar 07 07:02:40 crc kubenswrapper[4941]: I0307 07:02:40.949633 4941 scope.go:117] "RemoveContainer" containerID="7416552df2e4055270a338bb841d809844616b740ce91a60c1b321d6a4dac058" Mar 07 07:03:02 crc kubenswrapper[4941]: I0307 07:03:02.211942 4941 scope.go:117] "RemoveContainer" containerID="d9c269d4a8b1f56ee0402d644bcbe0748bf9b3ad684452bbd07a64c7dd3e9c57" Mar 07 07:03:02 crc kubenswrapper[4941]: I0307 07:03:02.245942 4941 scope.go:117] "RemoveContainer" containerID="7bf7f4695cd43c0565ac5f6b74ad1e3889ca846037a49295da8a36a319c304e1" Mar 07 07:03:02 crc kubenswrapper[4941]: I0307 07:03:02.271178 4941 scope.go:117] "RemoveContainer" containerID="34d2d8c8f7224ea5b19fb4cc349409b66394df63edeaec259d4dc4449b46fb16" Mar 07 07:03:02 crc kubenswrapper[4941]: I0307 07:03:02.312286 4941 scope.go:117] "RemoveContainer" containerID="e1640f8173deaee95910a78d9af2192ee7718b4419a38bd0a2618c121759c01f" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.152993 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547784-7g5m7"] Mar 07 07:04:00 crc kubenswrapper[4941]: E0307 07:04:00.154455 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce6a206-b2ea-41bd-84f7-c7f1007b321c" containerName="oc" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.154491 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce6a206-b2ea-41bd-84f7-c7f1007b321c" containerName="oc" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.154705 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ce6a206-b2ea-41bd-84f7-c7f1007b321c" containerName="oc" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.155582 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547784-7g5m7" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.159338 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.159636 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.159656 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.162461 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547784-7g5m7"] Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.324433 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgpdj\" (UniqueName: \"kubernetes.io/projected/89de2b71-4ba3-4287-8689-52d30ddea0bd-kube-api-access-jgpdj\") pod \"auto-csr-approver-29547784-7g5m7\" (UID: \"89de2b71-4ba3-4287-8689-52d30ddea0bd\") " pod="openshift-infra/auto-csr-approver-29547784-7g5m7" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.425793 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgpdj\" (UniqueName: \"kubernetes.io/projected/89de2b71-4ba3-4287-8689-52d30ddea0bd-kube-api-access-jgpdj\") pod \"auto-csr-approver-29547784-7g5m7\" (UID: \"89de2b71-4ba3-4287-8689-52d30ddea0bd\") " pod="openshift-infra/auto-csr-approver-29547784-7g5m7" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.451422 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgpdj\" (UniqueName: \"kubernetes.io/projected/89de2b71-4ba3-4287-8689-52d30ddea0bd-kube-api-access-jgpdj\") pod \"auto-csr-approver-29547784-7g5m7\" (UID: \"89de2b71-4ba3-4287-8689-52d30ddea0bd\") " 
pod="openshift-infra/auto-csr-approver-29547784-7g5m7" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.487930 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547784-7g5m7" Mar 07 07:04:00 crc kubenswrapper[4941]: I0307 07:04:00.657121 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547784-7g5m7"] Mar 07 07:04:01 crc kubenswrapper[4941]: I0307 07:04:01.495920 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547784-7g5m7" event={"ID":"89de2b71-4ba3-4287-8689-52d30ddea0bd","Type":"ContainerStarted","Data":"b6fbac2feffe99e79cf1abe9b9bafd39512c3b15ca1a2415ace1fb654ccafb96"} Mar 07 07:04:02 crc kubenswrapper[4941]: I0307 07:04:02.504542 4941 generic.go:334] "Generic (PLEG): container finished" podID="89de2b71-4ba3-4287-8689-52d30ddea0bd" containerID="6ed1b923f2ae62ed2ce73a78f3fb6f447a9d2f292591ac196920f850bba10c79" exitCode=0 Mar 07 07:04:02 crc kubenswrapper[4941]: I0307 07:04:02.504674 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547784-7g5m7" event={"ID":"89de2b71-4ba3-4287-8689-52d30ddea0bd","Type":"ContainerDied","Data":"6ed1b923f2ae62ed2ce73a78f3fb6f447a9d2f292591ac196920f850bba10c79"} Mar 07 07:04:03 crc kubenswrapper[4941]: I0307 07:04:03.806535 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547784-7g5m7" Mar 07 07:04:03 crc kubenswrapper[4941]: I0307 07:04:03.986398 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgpdj\" (UniqueName: \"kubernetes.io/projected/89de2b71-4ba3-4287-8689-52d30ddea0bd-kube-api-access-jgpdj\") pod \"89de2b71-4ba3-4287-8689-52d30ddea0bd\" (UID: \"89de2b71-4ba3-4287-8689-52d30ddea0bd\") " Mar 07 07:04:03 crc kubenswrapper[4941]: I0307 07:04:03.996692 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89de2b71-4ba3-4287-8689-52d30ddea0bd-kube-api-access-jgpdj" (OuterVolumeSpecName: "kube-api-access-jgpdj") pod "89de2b71-4ba3-4287-8689-52d30ddea0bd" (UID: "89de2b71-4ba3-4287-8689-52d30ddea0bd"). InnerVolumeSpecName "kube-api-access-jgpdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:04:04 crc kubenswrapper[4941]: I0307 07:04:04.089276 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgpdj\" (UniqueName: \"kubernetes.io/projected/89de2b71-4ba3-4287-8689-52d30ddea0bd-kube-api-access-jgpdj\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:04 crc kubenswrapper[4941]: I0307 07:04:04.519863 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547784-7g5m7" event={"ID":"89de2b71-4ba3-4287-8689-52d30ddea0bd","Type":"ContainerDied","Data":"b6fbac2feffe99e79cf1abe9b9bafd39512c3b15ca1a2415ace1fb654ccafb96"} Mar 07 07:04:04 crc kubenswrapper[4941]: I0307 07:04:04.519917 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6fbac2feffe99e79cf1abe9b9bafd39512c3b15ca1a2415ace1fb654ccafb96" Mar 07 07:04:04 crc kubenswrapper[4941]: I0307 07:04:04.519926 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547784-7g5m7" Mar 07 07:04:04 crc kubenswrapper[4941]: I0307 07:04:04.888560 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547778-x2xv2"] Mar 07 07:04:04 crc kubenswrapper[4941]: I0307 07:04:04.895612 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547778-x2xv2"] Mar 07 07:04:05 crc kubenswrapper[4941]: I0307 07:04:05.971784 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a8efd6-1008-4ba6-9db3-afb764026fc3" path="/var/lib/kubelet/pods/d8a8efd6-1008-4ba6-9db3-afb764026fc3/volumes" Mar 07 07:04:40 crc kubenswrapper[4941]: I0307 07:04:40.314280 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:04:40 crc kubenswrapper[4941]: I0307 07:04:40.315225 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:05:02 crc kubenswrapper[4941]: I0307 07:05:02.396475 4941 scope.go:117] "RemoveContainer" containerID="8ee64e9a2a790df233102d47f57d259db965b475e842ee99d53a89aa716bd946" Mar 07 07:05:10 crc kubenswrapper[4941]: I0307 07:05:10.314517 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:05:10 crc kubenswrapper[4941]: 
I0307 07:05:10.315585 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4941]: I0307 07:05:22.865854 4941 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 07:05:40 crc kubenswrapper[4941]: I0307 07:05:40.314450 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:05:40 crc kubenswrapper[4941]: I0307 07:05:40.315035 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:05:40 crc kubenswrapper[4941]: I0307 07:05:40.315099 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:05:40 crc kubenswrapper[4941]: I0307 07:05:40.315873 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7fe83ca68d83b7b6bf2fa18e88a311d0a293429f704eef511a481cd353a9e5a"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:05:40 crc kubenswrapper[4941]: I0307 07:05:40.315926 
4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://b7fe83ca68d83b7b6bf2fa18e88a311d0a293429f704eef511a481cd353a9e5a" gracePeriod=600 Mar 07 07:05:41 crc kubenswrapper[4941]: I0307 07:05:41.169274 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="b7fe83ca68d83b7b6bf2fa18e88a311d0a293429f704eef511a481cd353a9e5a" exitCode=0 Mar 07 07:05:41 crc kubenswrapper[4941]: I0307 07:05:41.169393 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"b7fe83ca68d83b7b6bf2fa18e88a311d0a293429f704eef511a481cd353a9e5a"} Mar 07 07:05:41 crc kubenswrapper[4941]: I0307 07:05:41.170353 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"81c89fa64b6b91f6338e8315cd83a021b0214053cc3ad130bb16369071ad3bcf"} Mar 07 07:05:41 crc kubenswrapper[4941]: I0307 07:05:41.170388 4941 scope.go:117] "RemoveContainer" containerID="d746908cc3303b1fdbe53fad12169c53fe203d0fd3b7bd7e783a1191a81869a4" Mar 07 07:05:52 crc kubenswrapper[4941]: I0307 07:05:52.203514 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5ztp"] Mar 07 07:05:52 crc kubenswrapper[4941]: I0307 07:05:52.204820 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovn-controller" containerID="cri-o://5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4" gracePeriod=30 Mar 07 07:05:52 crc 
kubenswrapper[4941]: I0307 07:05:52.204884 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="nbdb" containerID="cri-o://e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610" gracePeriod=30 Mar 07 07:05:52 crc kubenswrapper[4941]: I0307 07:05:52.204952 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovn-acl-logging" containerID="cri-o://397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7" gracePeriod=30 Mar 07 07:05:52 crc kubenswrapper[4941]: I0307 07:05:52.204941 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kube-rbac-proxy-node" containerID="cri-o://22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471" gracePeriod=30 Mar 07 07:05:52 crc kubenswrapper[4941]: I0307 07:05:52.204986 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="sbdb" containerID="cri-o://ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486" gracePeriod=30 Mar 07 07:05:52 crc kubenswrapper[4941]: I0307 07:05:52.205026 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="northd" containerID="cri-o://6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23" gracePeriod=30 Mar 07 07:05:52 crc kubenswrapper[4941]: I0307 07:05:52.204941 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" 
podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c" gracePeriod=30 Mar 07 07:05:52 crc kubenswrapper[4941]: I0307 07:05:52.261650 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" containerID="cri-o://9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" gracePeriod=30 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.050583 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/3.log" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.054120 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovn-acl-logging/0.log" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.054905 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovn-controller/0.log" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.056450 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.127752 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qsnwt"] Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128057 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128073 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128081 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128087 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128096 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128104 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128117 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="sbdb" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128123 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="sbdb" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128134 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89de2b71-4ba3-4287-8689-52d30ddea0bd" 
containerName="oc" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128140 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="89de2b71-4ba3-4287-8689-52d30ddea0bd" containerName="oc" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128155 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovn-acl-logging" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128162 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovn-acl-logging" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128175 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="northd" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128183 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="northd" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128193 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kube-rbac-proxy-node" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128201 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kube-rbac-proxy-node" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128209 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kubecfg-setup" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128216 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kubecfg-setup" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128224 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 
07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128230 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128240 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="nbdb" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128246 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="nbdb" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128256 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovn-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128263 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovn-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128420 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovn-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128438 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128449 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kube-rbac-proxy-node" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128457 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="nbdb" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128468 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 
crc kubenswrapper[4941]: I0307 07:05:53.128479 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="89de2b71-4ba3-4287-8689-52d30ddea0bd" containerName="oc" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128488 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128498 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovn-acl-logging" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128506 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128515 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="sbdb" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128523 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="northd" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128532 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128544 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128665 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128678 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" 
Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.128690 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.128700 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3469f59-621c-4493-ade3-768772d05ebd" containerName="ovnkube-controller" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.131105 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138115 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-netd\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138214 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3469f59-621c-4493-ade3-768772d05ebd-ovn-node-metrics-cert\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138228 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138284 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-netns\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138401 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-node-log\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138493 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-openvswitch\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138590 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-var-lib-openvswitch\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138656 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcp6v\" (UniqueName: \"kubernetes.io/projected/c3469f59-621c-4493-ade3-768772d05ebd-kube-api-access-xcp6v\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138733 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138805 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-config\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138808 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138878 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-log-socket\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138912 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138901 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138941 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-log-socket" (OuterVolumeSpecName: "log-socket") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.138934 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-ovn-kubernetes\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139005 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139055 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-ovn\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139117 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-kubelet\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139129 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139144 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-slash\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139159 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139164 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-systemd\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139224 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-slash" (OuterVolumeSpecName: "host-slash") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139242 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139256 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-env-overrides\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139306 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-bin\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139323 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-etc-openvswitch\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139348 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-script-lib\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139368 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-systemd-units\") pod \"c3469f59-621c-4493-ade3-768772d05ebd\" (UID: \"c3469f59-621c-4493-ade3-768772d05ebd\") " Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139382 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139489 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139614 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-run-netns\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139640 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139720 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-var-lib-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc 
kubenswrapper[4941]: I0307 07:05:53.139748 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-ovnkube-script-lib\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139784 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-ovn\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139812 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-systemd\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139875 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-kubelet\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139900 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-slash\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc 
kubenswrapper[4941]: I0307 07:05:53.139917 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-cni-netd\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139953 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-ovnkube-config\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139972 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-node-log\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.139998 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-cni-bin\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140005 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140017 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/955df727-7843-409c-800f-c07a9aa037c4-ovn-node-metrics-cert\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140135 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140211 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-systemd-units\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140256 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpsml\" (UniqueName: \"kubernetes.io/projected/955df727-7843-409c-800f-c07a9aa037c4-kube-api-access-lpsml\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140309 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-log-socket\") pod 
\"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140382 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-etc-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140531 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140579 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-env-overrides\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140040 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140073 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-node-log" (OuterVolumeSpecName: "node-log") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140157 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.140781 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141203 4941 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141231 4941 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141242 4941 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-log-socket\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141255 4941 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141333 4941 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141363 4941 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-slash\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141386 4941 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc 
kubenswrapper[4941]: I0307 07:05:53.141567 4941 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141590 4941 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141609 4941 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141629 4941 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141647 4941 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.141666 4941 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.144798 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3469f59-621c-4493-ade3-768772d05ebd-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.145254 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3469f59-621c-4493-ade3-768772d05ebd-kube-api-access-xcp6v" (OuterVolumeSpecName: "kube-api-access-xcp6v") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "kube-api-access-xcp6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.168830 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c3469f59-621c-4493-ade3-768772d05ebd" (UID: "c3469f59-621c-4493-ade3-768772d05ebd"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.243799 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-kubelet\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.243861 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-slash\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.243886 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-cni-netd\") pod \"ovnkube-node-qsnwt\" (UID: 
\"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.243908 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-node-log\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.243926 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-ovnkube-config\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.243953 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-cni-bin\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.243974 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/955df727-7843-409c-800f-c07a9aa037c4-ovn-node-metrics-cert\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.243996 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244009 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-kubelet\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244079 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-systemd-units\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244087 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-slash\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244121 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-cni-bin\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244218 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-node-log\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244220 4941 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-cni-netd\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244166 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244029 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-systemd-units\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244312 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpsml\" (UniqueName: \"kubernetes.io/projected/955df727-7843-409c-800f-c07a9aa037c4-kube-api-access-lpsml\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244357 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-log-socket\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244427 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-etc-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244487 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244547 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-env-overrides\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244596 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-run-netns\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244625 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244696 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-var-lib-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244715 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-log-socket\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244735 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-ovnkube-script-lib\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244862 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-ovn\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.244897 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-systemd\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245014 4941 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3469f59-621c-4493-ade3-768772d05ebd-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" 
Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245028 4941 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245039 4941 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3469f59-621c-4493-ade3-768772d05ebd-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245051 4941 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-node-log\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245075 4941 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245095 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcp6v\" (UniqueName: \"kubernetes.io/projected/c3469f59-621c-4493-ade3-768772d05ebd-kube-api-access-xcp6v\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245109 4941 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3469f59-621c-4493-ade3-768772d05ebd-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245008 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-ovnkube-config\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 
07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245292 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-ovnkube-script-lib\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245304 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245359 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-var-lib-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245367 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245375 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-etc-openvswitch\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 
07:05:53.245429 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-systemd\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245415 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/955df727-7843-409c-800f-c07a9aa037c4-env-overrides\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245429 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-host-run-netns\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.245523 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/955df727-7843-409c-800f-c07a9aa037c4-run-ovn\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.248265 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/955df727-7843-409c-800f-c07a9aa037c4-ovn-node-metrics-cert\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.260894 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovnkube-controller/3.log" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.264426 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovn-acl-logging/0.log" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265168 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5ztp_c3469f59-621c-4493-ade3-768772d05ebd/ovn-controller/0.log" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265822 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" exitCode=0 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265900 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486" exitCode=0 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265935 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610" exitCode=0 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265949 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23" exitCode=0 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265960 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c" exitCode=0 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265972 4941 generic.go:334] "Generic (PLEG): container finished" 
podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471" exitCode=0 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265949 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266056 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.265986 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7" exitCode=143 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266103 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266131 4941 scope.go:117] "RemoveContainer" containerID="9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266185 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266219 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" 
event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266235 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266250 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266264 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266282 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266289 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266296 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266304 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266310 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266316 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266322 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266328 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266336 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266347 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266355 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266107 4941 generic.go:334] 
"Generic (PLEG): container finished" podID="c3469f59-621c-4493-ade3-768772d05ebd" containerID="5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4" exitCode=143 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266362 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266475 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266494 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266506 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266532 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266542 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266549 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266557 4941 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266615 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266661 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266669 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266698 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266707 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266714 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266721 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} Mar 07 
07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266727 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266734 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266741 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266748 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266778 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5ztp" event={"ID":"c3469f59-621c-4493-ade3-768772d05ebd","Type":"ContainerDied","Data":"5e9b053798d2692856ad88ba678cdc393c3bd826be5094ccd0ebad12f91e397e"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266793 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266803 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266818 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266825 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266852 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266862 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266871 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266880 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266888 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.266949 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.271487 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lpsml\" (UniqueName: \"kubernetes.io/projected/955df727-7843-409c-800f-c07a9aa037c4-kube-api-access-lpsml\") pod \"ovnkube-node-qsnwt\" (UID: \"955df727-7843-409c-800f-c07a9aa037c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.271650 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/2.log" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.272380 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/1.log" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.272472 4941 generic.go:334] "Generic (PLEG): container finished" podID="ed82bc0c-1609-449c-b2e2-2fe04af9749d" containerID="308a567803d153cdae67d929aaf40daee703177d7713986faf0d6ac3e4e79eb6" exitCode=2 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.272536 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kc9rw" event={"ID":"ed82bc0c-1609-449c-b2e2-2fe04af9749d","Type":"ContainerDied","Data":"308a567803d153cdae67d929aaf40daee703177d7713986faf0d6ac3e4e79eb6"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.272597 4941 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261"} Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.273514 4941 scope.go:117] "RemoveContainer" containerID="308a567803d153cdae67d929aaf40daee703177d7713986faf0d6ac3e4e79eb6" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.306143 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.324074 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-x5ztp"] Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.327312 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5ztp"] Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.378030 4941 scope.go:117] "RemoveContainer" containerID="ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.398171 4941 scope.go:117] "RemoveContainer" containerID="e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.416645 4941 scope.go:117] "RemoveContainer" containerID="6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.437460 4941 scope.go:117] "RemoveContainer" containerID="3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.452057 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.454153 4941 scope.go:117] "RemoveContainer" containerID="22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.467728 4941 scope.go:117] "RemoveContainer" containerID="397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7" Mar 07 07:05:53 crc kubenswrapper[4941]: W0307 07:05:53.476297 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955df727_7843_409c_800f_c07a9aa037c4.slice/crio-3005e490413af14074cb43c2e61c6659b3e9d41c03982921ee741f4133c6ce80 WatchSource:0}: Error finding container 3005e490413af14074cb43c2e61c6659b3e9d41c03982921ee741f4133c6ce80: Status 404 returned error can't find the container with id 3005e490413af14074cb43c2e61c6659b3e9d41c03982921ee741f4133c6ce80 Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.537240 4941 scope.go:117] "RemoveContainer" containerID="5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.559812 4941 scope.go:117] "RemoveContainer" containerID="3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.599931 4941 scope.go:117] "RemoveContainer" containerID="9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.600741 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": container with ID starting with 9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753 not found: ID does not exist" containerID="9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" Mar 07 07:05:53 crc 
kubenswrapper[4941]: I0307 07:05:53.600797 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} err="failed to get container status \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": rpc error: code = NotFound desc = could not find container \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": container with ID starting with 9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.600832 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.602036 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": container with ID starting with 278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68 not found: ID does not exist" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.602145 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} err="failed to get container status \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": rpc error: code = NotFound desc = could not find container \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": container with ID starting with 278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.602212 4941 scope.go:117] "RemoveContainer" containerID="ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486" Mar 07 
07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.602852 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": container with ID starting with ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486 not found: ID does not exist" containerID="ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.602904 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} err="failed to get container status \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": rpc error: code = NotFound desc = could not find container \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": container with ID starting with ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.602949 4941 scope.go:117] "RemoveContainer" containerID="e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.603296 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": container with ID starting with e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610 not found: ID does not exist" containerID="e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.603324 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} err="failed to get container status 
\"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": rpc error: code = NotFound desc = could not find container \"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": container with ID starting with e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.603345 4941 scope.go:117] "RemoveContainer" containerID="6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.603652 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": container with ID starting with 6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23 not found: ID does not exist" containerID="6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.603673 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} err="failed to get container status \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": rpc error: code = NotFound desc = could not find container \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": container with ID starting with 6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.603687 4941 scope.go:117] "RemoveContainer" containerID="3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.603951 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": container with ID starting with 3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c not found: ID does not exist" containerID="3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.603982 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} err="failed to get container status \"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": rpc error: code = NotFound desc = could not find container \"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": container with ID starting with 3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.604005 4941 scope.go:117] "RemoveContainer" containerID="22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.604627 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": container with ID starting with 22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471 not found: ID does not exist" containerID="22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.604648 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} err="failed to get container status \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": rpc error: code = NotFound desc = could not find container \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": container with ID 
starting with 22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.604662 4941 scope.go:117] "RemoveContainer" containerID="397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.605090 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": container with ID starting with 397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7 not found: ID does not exist" containerID="397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.605122 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} err="failed to get container status \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": rpc error: code = NotFound desc = could not find container \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": container with ID starting with 397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.605143 4941 scope.go:117] "RemoveContainer" containerID="5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.605552 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": container with ID starting with 5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4 not found: ID does not exist" containerID="5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4" Mar 07 
07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.605574 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} err="failed to get container status \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": rpc error: code = NotFound desc = could not find container \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": container with ID starting with 5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.605597 4941 scope.go:117] "RemoveContainer" containerID="3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df" Mar 07 07:05:53 crc kubenswrapper[4941]: E0307 07:05:53.605951 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": container with ID starting with 3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df not found: ID does not exist" containerID="3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.606012 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} err="failed to get container status \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": rpc error: code = NotFound desc = could not find container \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": container with ID starting with 3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.606059 4941 scope.go:117] "RemoveContainer" 
containerID="9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.606388 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} err="failed to get container status \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": rpc error: code = NotFound desc = could not find container \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": container with ID starting with 9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.606454 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.606764 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} err="failed to get container status \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": rpc error: code = NotFound desc = could not find container \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": container with ID starting with 278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.606794 4941 scope.go:117] "RemoveContainer" containerID="ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.607075 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} err="failed to get container status \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": rpc error: code = NotFound desc = could 
not find container \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": container with ID starting with ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.607096 4941 scope.go:117] "RemoveContainer" containerID="e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.607315 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} err="failed to get container status \"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": rpc error: code = NotFound desc = could not find container \"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": container with ID starting with e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.607328 4941 scope.go:117] "RemoveContainer" containerID="6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.607652 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} err="failed to get container status \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": rpc error: code = NotFound desc = could not find container \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": container with ID starting with 6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.607682 4941 scope.go:117] "RemoveContainer" containerID="3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 
07:05:53.607995 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} err="failed to get container status \"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": rpc error: code = NotFound desc = could not find container \"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": container with ID starting with 3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.608009 4941 scope.go:117] "RemoveContainer" containerID="22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.608650 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} err="failed to get container status \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": rpc error: code = NotFound desc = could not find container \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": container with ID starting with 22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.608676 4941 scope.go:117] "RemoveContainer" containerID="397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.608971 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} err="failed to get container status \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": rpc error: code = NotFound desc = could not find container \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": container with ID starting with 
397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.609002 4941 scope.go:117] "RemoveContainer" containerID="5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.609494 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} err="failed to get container status \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": rpc error: code = NotFound desc = could not find container \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": container with ID starting with 5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.609513 4941 scope.go:117] "RemoveContainer" containerID="3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.609825 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} err="failed to get container status \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": rpc error: code = NotFound desc = could not find container \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": container with ID starting with 3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.609848 4941 scope.go:117] "RemoveContainer" containerID="9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.610125 4941 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} err="failed to get container status \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": rpc error: code = NotFound desc = could not find container \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": container with ID starting with 9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.610170 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.610575 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} err="failed to get container status \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": rpc error: code = NotFound desc = could not find container \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": container with ID starting with 278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.610600 4941 scope.go:117] "RemoveContainer" containerID="ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.610856 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} err="failed to get container status \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": rpc error: code = NotFound desc = could not find container \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": container with ID starting with ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486 not found: ID does not 
exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.610874 4941 scope.go:117] "RemoveContainer" containerID="e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.611181 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} err="failed to get container status \"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": rpc error: code = NotFound desc = could not find container \"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": container with ID starting with e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.611207 4941 scope.go:117] "RemoveContainer" containerID="6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.611675 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} err="failed to get container status \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": rpc error: code = NotFound desc = could not find container \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": container with ID starting with 6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.611701 4941 scope.go:117] "RemoveContainer" containerID="3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.612037 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} err="failed to get container status 
\"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": rpc error: code = NotFound desc = could not find container \"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": container with ID starting with 3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.612057 4941 scope.go:117] "RemoveContainer" containerID="22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.612630 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} err="failed to get container status \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": rpc error: code = NotFound desc = could not find container \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": container with ID starting with 22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.612650 4941 scope.go:117] "RemoveContainer" containerID="397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.613035 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} err="failed to get container status \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": rpc error: code = NotFound desc = could not find container \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": container with ID starting with 397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.613051 4941 scope.go:117] "RemoveContainer" 
containerID="5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.613561 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} err="failed to get container status \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": rpc error: code = NotFound desc = could not find container \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": container with ID starting with 5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.613598 4941 scope.go:117] "RemoveContainer" containerID="3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.613885 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} err="failed to get container status \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": rpc error: code = NotFound desc = could not find container \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": container with ID starting with 3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.613916 4941 scope.go:117] "RemoveContainer" containerID="9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.614504 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} err="failed to get container status \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": rpc error: code = NotFound desc = could 
not find container \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": container with ID starting with 9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.614535 4941 scope.go:117] "RemoveContainer" containerID="278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.614917 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68"} err="failed to get container status \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": rpc error: code = NotFound desc = could not find container \"278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68\": container with ID starting with 278bfb3c1a016361c60d7ce4208414303569bdd6413c94b9c85fc2ac073a6a68 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.614936 4941 scope.go:117] "RemoveContainer" containerID="ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.615349 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486"} err="failed to get container status \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": rpc error: code = NotFound desc = could not find container \"ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486\": container with ID starting with ca964cc58d81f921960c79145fbf7ae8535bef7dfb681b6ffba7d610a3e9b486 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.615362 4941 scope.go:117] "RemoveContainer" containerID="e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 
07:05:53.615714 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610"} err="failed to get container status \"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": rpc error: code = NotFound desc = could not find container \"e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610\": container with ID starting with e6b9b1a1ca73ec8892f65c6cbb037db60403fbd5ec38844ce87c76d1c2555610 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.615751 4941 scope.go:117] "RemoveContainer" containerID="6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.616075 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23"} err="failed to get container status \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": rpc error: code = NotFound desc = could not find container \"6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23\": container with ID starting with 6975f54199db3badd19bfb604bbd619222102d706fc9c9ff840e92419ab8dd23 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.616093 4941 scope.go:117] "RemoveContainer" containerID="3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.616371 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c"} err="failed to get container status \"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": rpc error: code = NotFound desc = could not find container \"3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c\": container with ID starting with 
3542bcee76c6fa2003a0841b1984563cbc0c21d919d2043c9535b763a25b845c not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.616396 4941 scope.go:117] "RemoveContainer" containerID="22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.616758 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471"} err="failed to get container status \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": rpc error: code = NotFound desc = could not find container \"22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471\": container with ID starting with 22b7f5a94272cc910381735b0e0fbffd71d57d2f54d243613a03af2c0f349471 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.616776 4941 scope.go:117] "RemoveContainer" containerID="397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.617043 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7"} err="failed to get container status \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": rpc error: code = NotFound desc = could not find container \"397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7\": container with ID starting with 397e5976ebd58fadb2b724fe09169156ab700ae596fa0d417ff009d4df5608d7 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.617064 4941 scope.go:117] "RemoveContainer" containerID="5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.617559 4941 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4"} err="failed to get container status \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": rpc error: code = NotFound desc = could not find container \"5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4\": container with ID starting with 5c3b51d7e9563e73ebd679dc07fabaf41cbcae47553b4c262aec07bc1a1ea2e4 not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.617581 4941 scope.go:117] "RemoveContainer" containerID="3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.618832 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df"} err="failed to get container status \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": rpc error: code = NotFound desc = could not find container \"3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df\": container with ID starting with 3bea33cce0953aac81f8d107cc6b5288fe376263524f9d2c19dd2c9a66b609df not found: ID does not exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.618858 4941 scope.go:117] "RemoveContainer" containerID="9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.619670 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753"} err="failed to get container status \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": rpc error: code = NotFound desc = could not find container \"9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753\": container with ID starting with 9c806813adfebb222da210a3f10ca9fd3a21253c2cc226fab3bfdef57adf7753 not found: ID does not 
exist" Mar 07 07:05:53 crc kubenswrapper[4941]: I0307 07:05:53.962323 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3469f59-621c-4493-ade3-768772d05ebd" path="/var/lib/kubelet/pods/c3469f59-621c-4493-ade3-768772d05ebd/volumes" Mar 07 07:05:54 crc kubenswrapper[4941]: I0307 07:05:54.282274 4941 generic.go:334] "Generic (PLEG): container finished" podID="955df727-7843-409c-800f-c07a9aa037c4" containerID="d1a2fc8cb106e313bd7e7eecebfabc4b27f5345410ded3651be813741db27809" exitCode=0 Mar 07 07:05:54 crc kubenswrapper[4941]: I0307 07:05:54.282356 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerDied","Data":"d1a2fc8cb106e313bd7e7eecebfabc4b27f5345410ded3651be813741db27809"} Mar 07 07:05:54 crc kubenswrapper[4941]: I0307 07:05:54.282394 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"3005e490413af14074cb43c2e61c6659b3e9d41c03982921ee741f4133c6ce80"} Mar 07 07:05:54 crc kubenswrapper[4941]: I0307 07:05:54.287577 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/2.log" Mar 07 07:05:54 crc kubenswrapper[4941]: I0307 07:05:54.289471 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/1.log" Mar 07 07:05:54 crc kubenswrapper[4941]: I0307 07:05:54.289747 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kc9rw" event={"ID":"ed82bc0c-1609-449c-b2e2-2fe04af9749d","Type":"ContainerStarted","Data":"eb7de7b7892f2e7ebffa2f2ad6aa1b318b223b20f9aec7997c49efb259a7c39d"} Mar 07 07:05:55 crc kubenswrapper[4941]: I0307 07:05:55.301072 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"9c287abec6d0f1527772d9f66ce8328f745c92e3e8e7f729233cd0777f95caaf"} Mar 07 07:05:55 crc kubenswrapper[4941]: I0307 07:05:55.301444 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"53eb44213d6eb58897ac06640ef0f4759eb07bc199faa64393ce9d9b8f4fec79"} Mar 07 07:05:55 crc kubenswrapper[4941]: I0307 07:05:55.301466 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"abdc6a25ad302e1b17026a0575b1a47027153d584faa6cef44dc59b01f2c9674"} Mar 07 07:05:55 crc kubenswrapper[4941]: I0307 07:05:55.301479 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"c0d0dfde9b87e6f22a967e8adbc6cbbbda88c4e14fa053cffb8857fa09595db2"} Mar 07 07:05:56 crc kubenswrapper[4941]: I0307 07:05:56.309973 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"206de4c2bf008ec3b820d74af6df4eedc80d7458e7f6062283bc16694834aeff"} Mar 07 07:05:56 crc kubenswrapper[4941]: I0307 07:05:56.310712 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"9c89356d1b3f3eb7dc5aab8c36dd9ac7156fd8b6bbd9cb346bb4bfa23a2084a3"} Mar 07 07:05:58 crc kubenswrapper[4941]: I0307 07:05:58.330109 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" 
event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"052efa0c7b68f07f081c00ef17645eb207681105ea1de2798d6aaabf93fa4b5d"}
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.143989 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547786-dwbsp"]
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.145681 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.148181 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.148549 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.151958 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.246000 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg4m4\" (UniqueName: \"kubernetes.io/projected/46ddec52-6876-4e01-975f-0c00387eba75-kube-api-access-pg4m4\") pod \"auto-csr-approver-29547786-dwbsp\" (UID: \"46ddec52-6876-4e01-975f-0c00387eba75\") " pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.347221 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg4m4\" (UniqueName: \"kubernetes.io/projected/46ddec52-6876-4e01-975f-0c00387eba75-kube-api-access-pg4m4\") pod \"auto-csr-approver-29547786-dwbsp\" (UID: \"46ddec52-6876-4e01-975f-0c00387eba75\") " pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.350042 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" event={"ID":"955df727-7843-409c-800f-c07a9aa037c4","Type":"ContainerStarted","Data":"64a2c73422e652e2266d46f03d14b361e8e2f3d5c15b4a3f9bcf7c339c58fd55"}
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.350582 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.350676 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.350688 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.378856 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg4m4\" (UniqueName: \"kubernetes.io/projected/46ddec52-6876-4e01-975f-0c00387eba75-kube-api-access-pg4m4\") pod \"auto-csr-approver-29547786-dwbsp\" (UID: \"46ddec52-6876-4e01-975f-0c00387eba75\") " pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.398750 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.401762 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.438661 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" podStartSLOduration=7.438640603 podStartE2EDuration="7.438640603s" podCreationTimestamp="2026-03-07 07:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:00.392678678 +0000 UTC m=+857.345044143" watchObservedRunningTime="2026-03-07 07:06:00.438640603 +0000 UTC m=+857.391006068"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.484211 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:00 crc kubenswrapper[4941]: E0307 07:06:00.519297 4941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29547786-dwbsp_openshift-infra_46ddec52-6876-4e01-975f-0c00387eba75_0(608999de76138b3100675a55f898cdfd3358ba966fb1967264045aa9d379b2ac): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 07 07:06:00 crc kubenswrapper[4941]: E0307 07:06:00.519846 4941 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29547786-dwbsp_openshift-infra_46ddec52-6876-4e01-975f-0c00387eba75_0(608999de76138b3100675a55f898cdfd3358ba966fb1967264045aa9d379b2ac): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:00 crc kubenswrapper[4941]: E0307 07:06:00.519871 4941 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29547786-dwbsp_openshift-infra_46ddec52-6876-4e01-975f-0c00387eba75_0(608999de76138b3100675a55f898cdfd3358ba966fb1967264045aa9d379b2ac): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:00 crc kubenswrapper[4941]: E0307 07:06:00.519924 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29547786-dwbsp_openshift-infra(46ddec52-6876-4e01-975f-0c00387eba75)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29547786-dwbsp_openshift-infra(46ddec52-6876-4e01-975f-0c00387eba75)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29547786-dwbsp_openshift-infra_46ddec52-6876-4e01-975f-0c00387eba75_0(608999de76138b3100675a55f898cdfd3358ba966fb1967264045aa9d379b2ac): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29547786-dwbsp" podUID="46ddec52-6876-4e01-975f-0c00387eba75"
Mar 07 07:06:00 crc kubenswrapper[4941]: I0307 07:06:00.971114 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547786-dwbsp"]
Mar 07 07:06:01 crc kubenswrapper[4941]: I0307 07:06:01.366548 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:01 crc kubenswrapper[4941]: I0307 07:06:01.368913 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:01 crc kubenswrapper[4941]: E0307 07:06:01.407190 4941 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29547786-dwbsp_openshift-infra_46ddec52-6876-4e01-975f-0c00387eba75_0(dc5f5c0d8500c2fd76834e73ee77cc2607ec25055bc11493931cd0e1b527ff27): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 07 07:06:01 crc kubenswrapper[4941]: E0307 07:06:01.407292 4941 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29547786-dwbsp_openshift-infra_46ddec52-6876-4e01-975f-0c00387eba75_0(dc5f5c0d8500c2fd76834e73ee77cc2607ec25055bc11493931cd0e1b527ff27): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:01 crc kubenswrapper[4941]: E0307 07:06:01.407330 4941 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29547786-dwbsp_openshift-infra_46ddec52-6876-4e01-975f-0c00387eba75_0(dc5f5c0d8500c2fd76834e73ee77cc2607ec25055bc11493931cd0e1b527ff27): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:01 crc kubenswrapper[4941]: E0307 07:06:01.407439 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29547786-dwbsp_openshift-infra(46ddec52-6876-4e01-975f-0c00387eba75)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29547786-dwbsp_openshift-infra(46ddec52-6876-4e01-975f-0c00387eba75)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29547786-dwbsp_openshift-infra_46ddec52-6876-4e01-975f-0c00387eba75_0(dc5f5c0d8500c2fd76834e73ee77cc2607ec25055bc11493931cd0e1b527ff27): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29547786-dwbsp" podUID="46ddec52-6876-4e01-975f-0c00387eba75"
Mar 07 07:06:02 crc kubenswrapper[4941]: I0307 07:06:02.506459 4941 scope.go:117] "RemoveContainer" containerID="9c8147115ce051f627cdbae790a127df650235a4ea68236d79581d1848d46261"
Mar 07 07:06:03 crc kubenswrapper[4941]: I0307 07:06:03.381211 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kc9rw_ed82bc0c-1609-449c-b2e2-2fe04af9749d/kube-multus/2.log"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.019604 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-xg6d8"]
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.021385 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.023936 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xg6d8"]
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.024152 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.024376 4941 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-5bgl8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.024555 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.024670 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.213695 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a4066aa8-776a-435a-a796-4c7e778984a2-crc-storage\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.213764 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jgb9\" (UniqueName: \"kubernetes.io/projected/a4066aa8-776a-435a-a796-4c7e778984a2-kube-api-access-4jgb9\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.213815 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a4066aa8-776a-435a-a796-4c7e778984a2-node-mnt\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.315992 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a4066aa8-776a-435a-a796-4c7e778984a2-crc-storage\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.316081 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jgb9\" (UniqueName: \"kubernetes.io/projected/a4066aa8-776a-435a-a796-4c7e778984a2-kube-api-access-4jgb9\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.316118 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a4066aa8-776a-435a-a796-4c7e778984a2-node-mnt\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.316581 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a4066aa8-776a-435a-a796-4c7e778984a2-node-mnt\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.317998 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a4066aa8-776a-435a-a796-4c7e778984a2-crc-storage\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.344879 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jgb9\" (UniqueName: \"kubernetes.io/projected/a4066aa8-776a-435a-a796-4c7e778984a2-kube-api-access-4jgb9\") pod \"crc-storage-crc-xg6d8\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") " pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.345508 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:05 crc kubenswrapper[4941]: I0307 07:06:05.597131 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xg6d8"]
Mar 07 07:06:05 crc kubenswrapper[4941]: W0307 07:06:05.603196 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4066aa8_776a_435a_a796_4c7e778984a2.slice/crio-1c41fdd203ca35d1e2f37cb4070429b21add810be3fbd2a4a67caac3ed5113b3 WatchSource:0}: Error finding container 1c41fdd203ca35d1e2f37cb4070429b21add810be3fbd2a4a67caac3ed5113b3: Status 404 returned error can't find the container with id 1c41fdd203ca35d1e2f37cb4070429b21add810be3fbd2a4a67caac3ed5113b3
Mar 07 07:06:06 crc kubenswrapper[4941]: I0307 07:06:06.401516 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xg6d8" event={"ID":"a4066aa8-776a-435a-a796-4c7e778984a2","Type":"ContainerStarted","Data":"1c41fdd203ca35d1e2f37cb4070429b21add810be3fbd2a4a67caac3ed5113b3"}
Mar 07 07:06:07 crc kubenswrapper[4941]: I0307 07:06:07.411250 4941 generic.go:334] "Generic (PLEG): container finished" podID="a4066aa8-776a-435a-a796-4c7e778984a2" containerID="984a768dd4d1148f305d18f90cc65fa007db1f05c5e993a3b76d3b3eba613ff1" exitCode=0
Mar 07 07:06:07 crc kubenswrapper[4941]: I0307 07:06:07.411376 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xg6d8" event={"ID":"a4066aa8-776a-435a-a796-4c7e778984a2","Type":"ContainerDied","Data":"984a768dd4d1148f305d18f90cc65fa007db1f05c5e993a3b76d3b3eba613ff1"}
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.641455 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.765022 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jgb9\" (UniqueName: \"kubernetes.io/projected/a4066aa8-776a-435a-a796-4c7e778984a2-kube-api-access-4jgb9\") pod \"a4066aa8-776a-435a-a796-4c7e778984a2\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") "
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.765106 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a4066aa8-776a-435a-a796-4c7e778984a2-node-mnt\") pod \"a4066aa8-776a-435a-a796-4c7e778984a2\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") "
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.765237 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a4066aa8-776a-435a-a796-4c7e778984a2-crc-storage\") pod \"a4066aa8-776a-435a-a796-4c7e778984a2\" (UID: \"a4066aa8-776a-435a-a796-4c7e778984a2\") "
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.765329 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4066aa8-776a-435a-a796-4c7e778984a2-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a4066aa8-776a-435a-a796-4c7e778984a2" (UID: "a4066aa8-776a-435a-a796-4c7e778984a2"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.765546 4941 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a4066aa8-776a-435a-a796-4c7e778984a2-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.771983 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4066aa8-776a-435a-a796-4c7e778984a2-kube-api-access-4jgb9" (OuterVolumeSpecName: "kube-api-access-4jgb9") pod "a4066aa8-776a-435a-a796-4c7e778984a2" (UID: "a4066aa8-776a-435a-a796-4c7e778984a2"). InnerVolumeSpecName "kube-api-access-4jgb9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.784029 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4066aa8-776a-435a-a796-4c7e778984a2-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a4066aa8-776a-435a-a796-4c7e778984a2" (UID: "a4066aa8-776a-435a-a796-4c7e778984a2"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.867226 4941 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a4066aa8-776a-435a-a796-4c7e778984a2-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 07 07:06:08 crc kubenswrapper[4941]: I0307 07:06:08.867259 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jgb9\" (UniqueName: \"kubernetes.io/projected/a4066aa8-776a-435a-a796-4c7e778984a2-kube-api-access-4jgb9\") on node \"crc\" DevicePath \"\""
Mar 07 07:06:09 crc kubenswrapper[4941]: I0307 07:06:09.428086 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xg6d8" event={"ID":"a4066aa8-776a-435a-a796-4c7e778984a2","Type":"ContainerDied","Data":"1c41fdd203ca35d1e2f37cb4070429b21add810be3fbd2a4a67caac3ed5113b3"}
Mar 07 07:06:09 crc kubenswrapper[4941]: I0307 07:06:09.428818 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c41fdd203ca35d1e2f37cb4070429b21add810be3fbd2a4a67caac3ed5113b3"
Mar 07 07:06:09 crc kubenswrapper[4941]: I0307 07:06:09.428248 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xg6d8"
Mar 07 07:06:12 crc kubenswrapper[4941]: I0307 07:06:12.954253 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:12 crc kubenswrapper[4941]: I0307 07:06:12.954987 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:13 crc kubenswrapper[4941]: I0307 07:06:13.216953 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547786-dwbsp"]
Mar 07 07:06:13 crc kubenswrapper[4941]: W0307 07:06:13.224761 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46ddec52_6876_4e01_975f_0c00387eba75.slice/crio-7d24bed0a12b0ce299c9961a52d4bdf835e7d08db0263ec978e6231c33c12b07 WatchSource:0}: Error finding container 7d24bed0a12b0ce299c9961a52d4bdf835e7d08db0263ec978e6231c33c12b07: Status 404 returned error can't find the container with id 7d24bed0a12b0ce299c9961a52d4bdf835e7d08db0263ec978e6231c33c12b07
Mar 07 07:06:13 crc kubenswrapper[4941]: I0307 07:06:13.453773 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547786-dwbsp" event={"ID":"46ddec52-6876-4e01-975f-0c00387eba75","Type":"ContainerStarted","Data":"7d24bed0a12b0ce299c9961a52d4bdf835e7d08db0263ec978e6231c33c12b07"}
Mar 07 07:06:14 crc kubenswrapper[4941]: I0307 07:06:14.459167 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547786-dwbsp" event={"ID":"46ddec52-6876-4e01-975f-0c00387eba75","Type":"ContainerStarted","Data":"6558c83382c07dad799d650a90899b82c4b381543d3e298c3a2126b0ed70b318"}
Mar 07 07:06:14 crc kubenswrapper[4941]: I0307 07:06:14.475239 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547786-dwbsp" podStartSLOduration=13.58672845 podStartE2EDuration="14.475213028s" podCreationTimestamp="2026-03-07 07:06:00 +0000 UTC" firstStartedPulling="2026-03-07 07:06:13.228782098 +0000 UTC m=+870.181147563" lastFinishedPulling="2026-03-07 07:06:14.117266666 +0000 UTC m=+871.069632141" observedRunningTime="2026-03-07 07:06:14.474846729 +0000 UTC m=+871.427212194" watchObservedRunningTime="2026-03-07 07:06:14.475213028 +0000 UTC m=+871.427578503"
Mar 07 07:06:15 crc kubenswrapper[4941]: I0307 07:06:15.467190 4941 generic.go:334] "Generic (PLEG): container finished" podID="46ddec52-6876-4e01-975f-0c00387eba75" containerID="6558c83382c07dad799d650a90899b82c4b381543d3e298c3a2126b0ed70b318" exitCode=0
Mar 07 07:06:15 crc kubenswrapper[4941]: I0307 07:06:15.467250 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547786-dwbsp" event={"ID":"46ddec52-6876-4e01-975f-0c00387eba75","Type":"ContainerDied","Data":"6558c83382c07dad799d650a90899b82c4b381543d3e298c3a2126b0ed70b318"}
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.338440 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"]
Mar 07 07:06:16 crc kubenswrapper[4941]: E0307 07:06:16.338981 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4066aa8-776a-435a-a796-4c7e778984a2" containerName="storage"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.338992 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4066aa8-776a-435a-a796-4c7e778984a2" containerName="storage"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.339096 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4066aa8-776a-435a-a796-4c7e778984a2" containerName="storage"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.339805 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.342228 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.349349 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"]
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.370747 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.370785 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.370806 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dm6f\" (UniqueName: \"kubernetes.io/projected/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-kube-api-access-5dm6f\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.472284 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.472337 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.472360 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dm6f\" (UniqueName: \"kubernetes.io/projected/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-kube-api-access-5dm6f\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.472919 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.473112 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.514966 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dm6f\" (UniqueName: \"kubernetes.io/projected/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-kube-api-access-5dm6f\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.721995 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.733086 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.881447 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg4m4\" (UniqueName: \"kubernetes.io/projected/46ddec52-6876-4e01-975f-0c00387eba75-kube-api-access-pg4m4\") pod \"46ddec52-6876-4e01-975f-0c00387eba75\" (UID: \"46ddec52-6876-4e01-975f-0c00387eba75\") "
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.887497 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ddec52-6876-4e01-975f-0c00387eba75-kube-api-access-pg4m4" (OuterVolumeSpecName: "kube-api-access-pg4m4") pod "46ddec52-6876-4e01-975f-0c00387eba75" (UID: "46ddec52-6876-4e01-975f-0c00387eba75"). InnerVolumeSpecName "kube-api-access-pg4m4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.951640 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g"]
Mar 07 07:06:16 crc kubenswrapper[4941]: W0307 07:06:16.959688 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524f2d31_e76d_4fdb_ad91_b6a4b85e3a4f.slice/crio-9fb726dce0acf5ffb1bbd9e35504d4f66259f7c8bbf3af22a3fa42d47417e162 WatchSource:0}: Error finding container 9fb726dce0acf5ffb1bbd9e35504d4f66259f7c8bbf3af22a3fa42d47417e162: Status 404 returned error can't find the container with id 9fb726dce0acf5ffb1bbd9e35504d4f66259f7c8bbf3af22a3fa42d47417e162
Mar 07 07:06:16 crc kubenswrapper[4941]: I0307 07:06:16.982989 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg4m4\" (UniqueName: \"kubernetes.io/projected/46ddec52-6876-4e01-975f-0c00387eba75-kube-api-access-pg4m4\") on node \"crc\" DevicePath \"\""
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.479621 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547786-dwbsp" event={"ID":"46ddec52-6876-4e01-975f-0c00387eba75","Type":"ContainerDied","Data":"7d24bed0a12b0ce299c9961a52d4bdf835e7d08db0263ec978e6231c33c12b07"}
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.480142 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d24bed0a12b0ce299c9961a52d4bdf835e7d08db0263ec978e6231c33c12b07"
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.479708 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-dwbsp"
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.481593 4941 generic.go:334] "Generic (PLEG): container finished" podID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerID="15c85623d9aac7aa609ed0725be5565e022c37db7a161d500223b9412f7a5e9a" exitCode=0
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.481646 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g" event={"ID":"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f","Type":"ContainerDied","Data":"15c85623d9aac7aa609ed0725be5565e022c37db7a161d500223b9412f7a5e9a"}
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.481680 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g" event={"ID":"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f","Type":"ContainerStarted","Data":"9fb726dce0acf5ffb1bbd9e35504d4f66259f7c8bbf3af22a3fa42d47417e162"}
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.532973 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547780-42t8n"]
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.535774 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547780-42t8n"]
Mar 07 07:06:17 crc kubenswrapper[4941]: I0307 07:06:17.961326 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da69c66c-a6c7-439f-a328-6514f986d16b" path="/var/lib/kubelet/pods/da69c66c-a6c7-439f-a328-6514f986d16b/volumes"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.705956 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hqrr9"]
Mar 07 07:06:18 crc kubenswrapper[4941]: E0307 07:06:18.706743 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ddec52-6876-4e01-975f-0c00387eba75" containerName="oc"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.706762 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ddec52-6876-4e01-975f-0c00387eba75" containerName="oc"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.706890 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ddec52-6876-4e01-975f-0c00387eba75" containerName="oc"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.707920 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.712855 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqrr9"]
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.716834 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2wx\" (UniqueName: \"kubernetes.io/projected/ea12ce8c-0652-424c-8c82-88df7a7b71e8-kube-api-access-vr2wx\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.716882 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-catalog-content\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.716907 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-utilities\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.817982 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2wx\" (UniqueName: \"kubernetes.io/projected/ea12ce8c-0652-424c-8c82-88df7a7b71e8-kube-api-access-vr2wx\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.818353 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-catalog-content\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.818426 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-utilities\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.819048 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-utilities\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.820239 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-catalog-content\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:18 crc kubenswrapper[4941]: I0307 07:06:18.840510 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2wx\" (UniqueName: \"kubernetes.io/projected/ea12ce8c-0652-424c-8c82-88df7a7b71e8-kube-api-access-vr2wx\") pod \"redhat-operators-hqrr9\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:19 crc kubenswrapper[4941]: I0307 07:06:19.032229 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqrr9"
Mar 07 07:06:19 crc kubenswrapper[4941]: I0307 07:06:19.482699 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqrr9"]
Mar 07 07:06:19 crc kubenswrapper[4941]: I0307 07:06:19.505705 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqrr9" event={"ID":"ea12ce8c-0652-424c-8c82-88df7a7b71e8","Type":"ContainerStarted","Data":"a103f5470f42ea4d7b10778a6ac273194a7bd2d7f1af4954f97f9ceea702a475"}
Mar 07 07:06:19 crc kubenswrapper[4941]: I0307 07:06:19.510141 4941 generic.go:334] "Generic (PLEG): container finished" podID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerID="754c785ba60be9b826c9ffe02766968648d4e1bfad31d43a9e5a75ec23aac73d" exitCode=0
Mar 07 07:06:19 crc kubenswrapper[4941]: I0307 07:06:19.510192 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g" event={"ID":"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f","Type":"ContainerDied","Data":"754c785ba60be9b826c9ffe02766968648d4e1bfad31d43a9e5a75ec23aac73d"}
Mar 07 07:06:20 crc kubenswrapper[4941]: I0307 07:06:20.519385 4941 generic.go:334] "Generic (PLEG): container finished" podID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerID="8891c9cb213b3ecf3e0ae7d37e6d9c32e0a634856e938c6100e8b2b4cfa4c8b9" exitCode=0
Mar 07 07:06:20 crc kubenswrapper[4941]: I0307 07:06:20.520653 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g" event={"ID":"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f","Type":"ContainerDied","Data":"8891c9cb213b3ecf3e0ae7d37e6d9c32e0a634856e938c6100e8b2b4cfa4c8b9"}
Mar 07 07:06:20 crc kubenswrapper[4941]: I0307 07:06:20.522822 4941 generic.go:334] "Generic (PLEG): container finished" podID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerID="ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2" exitCode=0
Mar 07 07:06:20 crc kubenswrapper[4941]: I0307 07:06:20.522901 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqrr9" event={"ID":"ea12ce8c-0652-424c-8c82-88df7a7b71e8","Type":"ContainerDied","Data":"ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2"}
Mar 07 07:06:21 crc kubenswrapper[4941]: I0307 07:06:21.531777 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqrr9" event={"ID":"ea12ce8c-0652-424c-8c82-88df7a7b71e8","Type":"ContainerStarted","Data":"f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4"}
Mar 07 07:06:21 crc kubenswrapper[4941]: I0307 07:06:21.784804 4941 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g" Mar 07 07:06:21 crc kubenswrapper[4941]: I0307 07:06:21.957493 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-util\") pod \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " Mar 07 07:06:21 crc kubenswrapper[4941]: I0307 07:06:21.957678 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-bundle\") pod \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " Mar 07 07:06:21 crc kubenswrapper[4941]: I0307 07:06:21.957765 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dm6f\" (UniqueName: \"kubernetes.io/projected/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-kube-api-access-5dm6f\") pod \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\" (UID: \"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f\") " Mar 07 07:06:21 crc kubenswrapper[4941]: I0307 07:06:21.959504 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-bundle" (OuterVolumeSpecName: "bundle") pod "524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" (UID: "524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:21 crc kubenswrapper[4941]: I0307 07:06:21.978582 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-kube-api-access-5dm6f" (OuterVolumeSpecName: "kube-api-access-5dm6f") pod "524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" (UID: "524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f"). InnerVolumeSpecName "kube-api-access-5dm6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:21 crc kubenswrapper[4941]: I0307 07:06:21.987070 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-util" (OuterVolumeSpecName: "util") pod "524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" (UID: "524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:22 crc kubenswrapper[4941]: I0307 07:06:22.060245 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dm6f\" (UniqueName: \"kubernetes.io/projected/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-kube-api-access-5dm6f\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:22 crc kubenswrapper[4941]: I0307 07:06:22.060281 4941 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:22 crc kubenswrapper[4941]: I0307 07:06:22.060293 4941 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:22 crc kubenswrapper[4941]: I0307 07:06:22.542755 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g" event={"ID":"524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f","Type":"ContainerDied","Data":"9fb726dce0acf5ffb1bbd9e35504d4f66259f7c8bbf3af22a3fa42d47417e162"} Mar 07 07:06:22 crc kubenswrapper[4941]: I0307 07:06:22.542812 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fb726dce0acf5ffb1bbd9e35504d4f66259f7c8bbf3af22a3fa42d47417e162" Mar 07 07:06:22 crc kubenswrapper[4941]: I0307 07:06:22.542820 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g" Mar 07 07:06:22 crc kubenswrapper[4941]: I0307 07:06:22.547025 4941 generic.go:334] "Generic (PLEG): container finished" podID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerID="f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4" exitCode=0 Mar 07 07:06:22 crc kubenswrapper[4941]: I0307 07:06:22.547066 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqrr9" event={"ID":"ea12ce8c-0652-424c-8c82-88df7a7b71e8","Type":"ContainerDied","Data":"f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4"} Mar 07 07:06:23 crc kubenswrapper[4941]: I0307 07:06:23.492916 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qsnwt" Mar 07 07:06:23 crc kubenswrapper[4941]: I0307 07:06:23.557517 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqrr9" event={"ID":"ea12ce8c-0652-424c-8c82-88df7a7b71e8","Type":"ContainerStarted","Data":"44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687"} Mar 07 07:06:23 crc kubenswrapper[4941]: I0307 07:06:23.580111 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hqrr9" podStartSLOduration=3.1451306629999998 podStartE2EDuration="5.580090043s" podCreationTimestamp="2026-03-07 07:06:18 +0000 UTC" firstStartedPulling="2026-03-07 07:06:20.5260419 +0000 UTC m=+877.478407405" lastFinishedPulling="2026-03-07 07:06:22.96100132 +0000 UTC m=+879.913366785" observedRunningTime="2026-03-07 07:06:23.579037927 +0000 UTC m=+880.531403412" watchObservedRunningTime="2026-03-07 07:06:23.580090043 +0000 UTC m=+880.532455518" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.823559 4941 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6"] Mar 07 07:06:26 crc kubenswrapper[4941]: E0307 07:06:26.824200 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerName="pull" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.824216 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerName="pull" Mar 07 07:06:26 crc kubenswrapper[4941]: E0307 07:06:26.824231 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerName="extract" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.824240 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerName="extract" Mar 07 07:06:26 crc kubenswrapper[4941]: E0307 07:06:26.824251 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerName="util" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.824260 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerName="util" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.824394 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f" containerName="extract" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.824981 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.827372 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-vw99b" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.827393 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.829360 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.833719 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6"] Mar 07 07:06:26 crc kubenswrapper[4941]: I0307 07:06:26.926530 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599rm\" (UniqueName: \"kubernetes.io/projected/fa230b05-96f6-4c46-9f79-11ab1d72e453-kube-api-access-599rm\") pod \"nmstate-operator-75c5dccd6c-77mq6\" (UID: \"fa230b05-96f6-4c46-9f79-11ab1d72e453\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6" Mar 07 07:06:27 crc kubenswrapper[4941]: I0307 07:06:27.027816 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-599rm\" (UniqueName: \"kubernetes.io/projected/fa230b05-96f6-4c46-9f79-11ab1d72e453-kube-api-access-599rm\") pod \"nmstate-operator-75c5dccd6c-77mq6\" (UID: \"fa230b05-96f6-4c46-9f79-11ab1d72e453\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6" Mar 07 07:06:27 crc kubenswrapper[4941]: I0307 07:06:27.053471 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-599rm\" (UniqueName: \"kubernetes.io/projected/fa230b05-96f6-4c46-9f79-11ab1d72e453-kube-api-access-599rm\") pod \"nmstate-operator-75c5dccd6c-77mq6\" (UID: 
\"fa230b05-96f6-4c46-9f79-11ab1d72e453\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6" Mar 07 07:06:27 crc kubenswrapper[4941]: I0307 07:06:27.144325 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6" Mar 07 07:06:27 crc kubenswrapper[4941]: I0307 07:06:27.346751 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6"] Mar 07 07:06:27 crc kubenswrapper[4941]: I0307 07:06:27.584488 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6" event={"ID":"fa230b05-96f6-4c46-9f79-11ab1d72e453","Type":"ContainerStarted","Data":"04d8b15a2bec4682baf387230d6b1d8a191a970e9af1f43ecb41efe9d0c1f656"} Mar 07 07:06:29 crc kubenswrapper[4941]: I0307 07:06:29.033316 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hqrr9" Mar 07 07:06:29 crc kubenswrapper[4941]: I0307 07:06:29.033937 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hqrr9" Mar 07 07:06:30 crc kubenswrapper[4941]: I0307 07:06:30.081499 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqrr9" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="registry-server" probeResult="failure" output=< Mar 07 07:06:30 crc kubenswrapper[4941]: timeout: failed to connect service ":50051" within 1s Mar 07 07:06:30 crc kubenswrapper[4941]: > Mar 07 07:06:31 crc kubenswrapper[4941]: I0307 07:06:31.613141 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6" event={"ID":"fa230b05-96f6-4c46-9f79-11ab1d72e453","Type":"ContainerStarted","Data":"fee2309f232afd42c70af6bb3bdb7771a6a959b7fc2a0a9a273d675eefd185d4"} Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.442514 4941 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-77mq6" podStartSLOduration=6.448404952 podStartE2EDuration="10.442494357s" podCreationTimestamp="2026-03-07 07:06:26 +0000 UTC" firstStartedPulling="2026-03-07 07:06:27.357021214 +0000 UTC m=+884.309386679" lastFinishedPulling="2026-03-07 07:06:31.351110619 +0000 UTC m=+888.303476084" observedRunningTime="2026-03-07 07:06:31.631768491 +0000 UTC m=+888.584133956" watchObservedRunningTime="2026-03-07 07:06:36.442494357 +0000 UTC m=+893.394859822" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.445567 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-fznph"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.446552 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.449631 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rm8l8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.456739 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.457646 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.459576 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.464839 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a359df50-440b-47ae-a48d-2ab93b52db74-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-4w5nw\" (UID: \"a359df50-440b-47ae-a48d-2ab93b52db74\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.464954 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwg98\" (UniqueName: \"kubernetes.io/projected/a359df50-440b-47ae-a48d-2ab93b52db74-kube-api-access-pwg98\") pod \"nmstate-webhook-786f45cff4-4w5nw\" (UID: \"a359df50-440b-47ae-a48d-2ab93b52db74\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.465031 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8l8m\" (UniqueName: \"kubernetes.io/projected/9d22f709-5c4f-4809-8de1-515f401502fe-kube-api-access-t8l8m\") pod \"nmstate-metrics-69594cc75-fznph\" (UID: \"9d22f709-5c4f-4809-8de1-515f401502fe\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.473890 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-fznph"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.484978 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-b4gr5"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.485930 4941 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.508655 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.566667 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a359df50-440b-47ae-a48d-2ab93b52db74-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-4w5nw\" (UID: \"a359df50-440b-47ae-a48d-2ab93b52db74\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.566726 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwg98\" (UniqueName: \"kubernetes.io/projected/a359df50-440b-47ae-a48d-2ab93b52db74-kube-api-access-pwg98\") pod \"nmstate-webhook-786f45cff4-4w5nw\" (UID: \"a359df50-440b-47ae-a48d-2ab93b52db74\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.566758 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8l8m\" (UniqueName: \"kubernetes.io/projected/9d22f709-5c4f-4809-8de1-515f401502fe-kube-api-access-t8l8m\") pod \"nmstate-metrics-69594cc75-fznph\" (UID: \"9d22f709-5c4f-4809-8de1-515f401502fe\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" Mar 07 07:06:36 crc kubenswrapper[4941]: E0307 07:06:36.567115 4941 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 07 07:06:36 crc kubenswrapper[4941]: E0307 07:06:36.567168 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a359df50-440b-47ae-a48d-2ab93b52db74-tls-key-pair podName:a359df50-440b-47ae-a48d-2ab93b52db74 nodeName:}" failed. 
No retries permitted until 2026-03-07 07:06:37.067150187 +0000 UTC m=+894.019515652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a359df50-440b-47ae-a48d-2ab93b52db74-tls-key-pair") pod "nmstate-webhook-786f45cff4-4w5nw" (UID: "a359df50-440b-47ae-a48d-2ab93b52db74") : secret "openshift-nmstate-webhook" not found Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.605961 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8l8m\" (UniqueName: \"kubernetes.io/projected/9d22f709-5c4f-4809-8de1-515f401502fe-kube-api-access-t8l8m\") pod \"nmstate-metrics-69594cc75-fznph\" (UID: \"9d22f709-5c4f-4809-8de1-515f401502fe\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.621733 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwg98\" (UniqueName: \"kubernetes.io/projected/a359df50-440b-47ae-a48d-2ab93b52db74-kube-api-access-pwg98\") pod \"nmstate-webhook-786f45cff4-4w5nw\" (UID: \"a359df50-440b-47ae-a48d-2ab93b52db74\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.660282 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.661024 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.667222 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.667506 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xbh8f" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.667706 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-ovs-socket\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.667763 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-dbus-socket\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.667820 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-nmstate-lock\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.667879 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chgjz\" (UniqueName: \"kubernetes.io/projected/52b501d5-7ace-4d53-9102-fa7ed37df581-kube-api-access-chgjz\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " 
pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.671653 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.680302 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.769755 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj7qm\" (UniqueName: \"kubernetes.io/projected/ddd11f70-fc7d-478d-8036-62c895c6fb60-kube-api-access-vj7qm\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.769849 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-ovs-socket\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.769874 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-dbus-socket\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.769903 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-nmstate-lock\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 
crc kubenswrapper[4941]: I0307 07:06:36.769943 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chgjz\" (UniqueName: \"kubernetes.io/projected/52b501d5-7ace-4d53-9102-fa7ed37df581-kube-api-access-chgjz\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.769956 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-ovs-socket\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.769982 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ddd11f70-fc7d-478d-8036-62c895c6fb60-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.770041 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-nmstate-lock\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.770052 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd11f70-fc7d-478d-8036-62c895c6fb60-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 
07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.770222 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.770320 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/52b501d5-7ace-4d53-9102-fa7ed37df581-dbus-socket\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.791488 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chgjz\" (UniqueName: \"kubernetes.io/projected/52b501d5-7ace-4d53-9102-fa7ed37df581-kube-api-access-chgjz\") pod \"nmstate-handler-b4gr5\" (UID: \"52b501d5-7ace-4d53-9102-fa7ed37df581\") " pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.812329 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.831318 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-789c7b5974-22gxt"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.834444 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.851885 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-789c7b5974-22gxt"] Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.872088 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ddd11f70-fc7d-478d-8036-62c895c6fb60-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.872142 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd11f70-fc7d-478d-8036-62c895c6fb60-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.872187 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj7qm\" (UniqueName: \"kubernetes.io/projected/ddd11f70-fc7d-478d-8036-62c895c6fb60-kube-api-access-vj7qm\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.873293 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ddd11f70-fc7d-478d-8036-62c895c6fb60-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.878301 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd11f70-fc7d-478d-8036-62c895c6fb60-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.895149 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj7qm\" (UniqueName: \"kubernetes.io/projected/ddd11f70-fc7d-478d-8036-62c895c6fb60-kube-api-access-vj7qm\") pod \"nmstate-console-plugin-5dcbbd79cf-vd5f8\" (UID: \"ddd11f70-fc7d-478d-8036-62c895c6fb60\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.974175 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-config\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.974224 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-trusted-ca-bundle\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.974246 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-oauth-config\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " 
pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.974283 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwnjw\" (UniqueName: \"kubernetes.io/projected/ff6949bd-f40c-4bf3-bfb2-843db468bc49-kube-api-access-hwnjw\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.976773 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-service-ca\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.976818 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-serving-cert\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.976896 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-oauth-serving-cert\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:36 crc kubenswrapper[4941]: I0307 07:06:36.985226 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.078523 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-oauth-serving-cert\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.078581 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-config\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.078619 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-trusted-ca-bundle\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.078646 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-oauth-config\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.078664 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwnjw\" (UniqueName: \"kubernetes.io/projected/ff6949bd-f40c-4bf3-bfb2-843db468bc49-kube-api-access-hwnjw\") pod \"console-789c7b5974-22gxt\" (UID: 
\"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.078684 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-service-ca\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.078715 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-serving-cert\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.078749 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a359df50-440b-47ae-a48d-2ab93b52db74-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-4w5nw\" (UID: \"a359df50-440b-47ae-a48d-2ab93b52db74\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.079599 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-oauth-serving-cert\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.079803 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-service-ca\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " 
pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.080024 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-trusted-ca-bundle\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.080506 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-config\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.083806 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-oauth-config\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.085991 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff6949bd-f40c-4bf3-bfb2-843db468bc49-console-serving-cert\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.087181 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a359df50-440b-47ae-a48d-2ab93b52db74-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-4w5nw\" (UID: \"a359df50-440b-47ae-a48d-2ab93b52db74\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 
07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.101892 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwnjw\" (UniqueName: \"kubernetes.io/projected/ff6949bd-f40c-4bf3-bfb2-843db468bc49-kube-api-access-hwnjw\") pod \"console-789c7b5974-22gxt\" (UID: \"ff6949bd-f40c-4bf3-bfb2-843db468bc49\") " pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.102321 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.161804 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8"] Mar 07 07:06:37 crc kubenswrapper[4941]: W0307 07:06:37.164800 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd11f70_fc7d_478d_8036_62c895c6fb60.slice/crio-701ed633d30b849166786f4ac6e38cc2667abf6039ca3cfdd5da7245b792663d WatchSource:0}: Error finding container 701ed633d30b849166786f4ac6e38cc2667abf6039ca3cfdd5da7245b792663d: Status 404 returned error can't find the container with id 701ed633d30b849166786f4ac6e38cc2667abf6039ca3cfdd5da7245b792663d Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.237347 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.283499 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-fznph"] Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.367655 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw"] Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.450521 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-789c7b5974-22gxt"] Mar 07 07:06:37 crc kubenswrapper[4941]: W0307 07:06:37.458265 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6949bd_f40c_4bf3_bfb2_843db468bc49.slice/crio-75ea291ae90b7f6a195ea9456483368b91e81678deeb0590bf5ca4d207d658d2 WatchSource:0}: Error finding container 75ea291ae90b7f6a195ea9456483368b91e81678deeb0590bf5ca4d207d658d2: Status 404 returned error can't find the container with id 75ea291ae90b7f6a195ea9456483368b91e81678deeb0590bf5ca4d207d658d2 Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.691737 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" event={"ID":"ddd11f70-fc7d-478d-8036-62c895c6fb60","Type":"ContainerStarted","Data":"701ed633d30b849166786f4ac6e38cc2667abf6039ca3cfdd5da7245b792663d"} Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.693328 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" event={"ID":"a359df50-440b-47ae-a48d-2ab93b52db74","Type":"ContainerStarted","Data":"d87d9713659c9605d2d700a05309cf9643be857be6c5854866fbe6632604fead"} Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.695069 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-789c7b5974-22gxt" 
event={"ID":"ff6949bd-f40c-4bf3-bfb2-843db468bc49","Type":"ContainerStarted","Data":"37b0504bd120707eb981703e2cc967939ad244caee0d7f0951695f29bec59764"} Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.695092 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-789c7b5974-22gxt" event={"ID":"ff6949bd-f40c-4bf3-bfb2-843db468bc49","Type":"ContainerStarted","Data":"75ea291ae90b7f6a195ea9456483368b91e81678deeb0590bf5ca4d207d658d2"} Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.696598 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" event={"ID":"9d22f709-5c4f-4809-8de1-515f401502fe","Type":"ContainerStarted","Data":"ec2eecbccdea0b41bb105dbae14f2eb116012f847fbe0810a6c8e4d74e111fca"} Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.698005 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b4gr5" event={"ID":"52b501d5-7ace-4d53-9102-fa7ed37df581","Type":"ContainerStarted","Data":"0947466db63d7cb1b69d5aa8f685487bd886496b5ffeeb0276d9430afe4934ee"} Mar 07 07:06:37 crc kubenswrapper[4941]: I0307 07:06:37.710886 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-789c7b5974-22gxt" podStartSLOduration=1.710868301 podStartE2EDuration="1.710868301s" podCreationTimestamp="2026-03-07 07:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:37.710789489 +0000 UTC m=+894.663154954" watchObservedRunningTime="2026-03-07 07:06:37.710868301 +0000 UTC m=+894.663233756" Mar 07 07:06:39 crc kubenswrapper[4941]: I0307 07:06:39.070272 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hqrr9" Mar 07 07:06:39 crc kubenswrapper[4941]: I0307 07:06:39.120937 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-hqrr9" Mar 07 07:06:39 crc kubenswrapper[4941]: I0307 07:06:39.299066 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqrr9"] Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.720205 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" event={"ID":"9d22f709-5c4f-4809-8de1-515f401502fe","Type":"ContainerStarted","Data":"7c83c535f1251b23cc9b4db27d9f8ae41b8a8ca498c072b4323df11dade08c1a"} Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.723813 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b4gr5" event={"ID":"52b501d5-7ace-4d53-9102-fa7ed37df581","Type":"ContainerStarted","Data":"670fc37d12964c1117b7f8f34aab3714cd8ef6fbb1728f20d92372ad0301affa"} Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.724003 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.725565 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" event={"ID":"ddd11f70-fc7d-478d-8036-62c895c6fb60","Type":"ContainerStarted","Data":"2657c768560ffa72c854fbe6fe57c6b8a2c90944e1cad6fb7b532f23fc450de3"} Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.728273 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" event={"ID":"a359df50-440b-47ae-a48d-2ab93b52db74","Type":"ContainerStarted","Data":"6314cf5ea9a82c1def0ae5bcb9e9cdc4d1bb479d1714d89afd87cba1fe045aac"} Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.728475 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.728886 4941 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hqrr9" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="registry-server" containerID="cri-o://44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687" gracePeriod=2 Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.740604 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-b4gr5" podStartSLOduration=1.775360163 podStartE2EDuration="4.740563973s" podCreationTimestamp="2026-03-07 07:06:36 +0000 UTC" firstStartedPulling="2026-03-07 07:06:36.879326478 +0000 UTC m=+893.831691943" lastFinishedPulling="2026-03-07 07:06:39.844530278 +0000 UTC m=+896.796895753" observedRunningTime="2026-03-07 07:06:40.739003014 +0000 UTC m=+897.691368479" watchObservedRunningTime="2026-03-07 07:06:40.740563973 +0000 UTC m=+897.692929438" Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.763904 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vd5f8" podStartSLOduration=2.08368457 podStartE2EDuration="4.763879549s" podCreationTimestamp="2026-03-07 07:06:36 +0000 UTC" firstStartedPulling="2026-03-07 07:06:37.167313373 +0000 UTC m=+894.119678838" lastFinishedPulling="2026-03-07 07:06:39.847508362 +0000 UTC m=+896.799873817" observedRunningTime="2026-03-07 07:06:40.755853931 +0000 UTC m=+897.708219406" watchObservedRunningTime="2026-03-07 07:06:40.763879549 +0000 UTC m=+897.716245014" Mar 07 07:06:40 crc kubenswrapper[4941]: I0307 07:06:40.787327 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" podStartSLOduration=2.314046541 podStartE2EDuration="4.787298897s" podCreationTimestamp="2026-03-07 07:06:36 +0000 UTC" firstStartedPulling="2026-03-07 07:06:37.371709083 +0000 UTC m=+894.324074548" lastFinishedPulling="2026-03-07 
07:06:39.844961429 +0000 UTC m=+896.797326904" observedRunningTime="2026-03-07 07:06:40.780827797 +0000 UTC m=+897.733193262" watchObservedRunningTime="2026-03-07 07:06:40.787298897 +0000 UTC m=+897.739664362" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.101169 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqrr9" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.288246 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2wx\" (UniqueName: \"kubernetes.io/projected/ea12ce8c-0652-424c-8c82-88df7a7b71e8-kube-api-access-vr2wx\") pod \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.288758 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-utilities\") pod \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.288832 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-catalog-content\") pod \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\" (UID: \"ea12ce8c-0652-424c-8c82-88df7a7b71e8\") " Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.289673 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-utilities" (OuterVolumeSpecName: "utilities") pod "ea12ce8c-0652-424c-8c82-88df7a7b71e8" (UID: "ea12ce8c-0652-424c-8c82-88df7a7b71e8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.297040 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea12ce8c-0652-424c-8c82-88df7a7b71e8-kube-api-access-vr2wx" (OuterVolumeSpecName: "kube-api-access-vr2wx") pod "ea12ce8c-0652-424c-8c82-88df7a7b71e8" (UID: "ea12ce8c-0652-424c-8c82-88df7a7b71e8"). InnerVolumeSpecName "kube-api-access-vr2wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.390758 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2wx\" (UniqueName: \"kubernetes.io/projected/ea12ce8c-0652-424c-8c82-88df7a7b71e8-kube-api-access-vr2wx\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.390827 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.446658 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea12ce8c-0652-424c-8c82-88df7a7b71e8" (UID: "ea12ce8c-0652-424c-8c82-88df7a7b71e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.492169 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea12ce8c-0652-424c-8c82-88df7a7b71e8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.740396 4941 generic.go:334] "Generic (PLEG): container finished" podID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerID="44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687" exitCode=0 Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.740473 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqrr9" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.740537 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqrr9" event={"ID":"ea12ce8c-0652-424c-8c82-88df7a7b71e8","Type":"ContainerDied","Data":"44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687"} Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.740578 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqrr9" event={"ID":"ea12ce8c-0652-424c-8c82-88df7a7b71e8","Type":"ContainerDied","Data":"a103f5470f42ea4d7b10778a6ac273194a7bd2d7f1af4954f97f9ceea702a475"} Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.740602 4941 scope.go:117] "RemoveContainer" containerID="44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687" Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.769733 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqrr9"] Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.773069 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hqrr9"] Mar 07 07:06:41 crc kubenswrapper[4941]: I0307 07:06:41.964135 
4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" path="/var/lib/kubelet/pods/ea12ce8c-0652-424c-8c82-88df7a7b71e8/volumes" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.117199 4941 scope.go:117] "RemoveContainer" containerID="f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.159135 4941 scope.go:117] "RemoveContainer" containerID="ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.174110 4941 scope.go:117] "RemoveContainer" containerID="44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687" Mar 07 07:06:42 crc kubenswrapper[4941]: E0307 07:06:42.174815 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687\": container with ID starting with 44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687 not found: ID does not exist" containerID="44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.174889 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687"} err="failed to get container status \"44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687\": rpc error: code = NotFound desc = could not find container \"44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687\": container with ID starting with 44d61cbdcca8f8e6459e4a3421921d5ef975e9234e346298cf4b519621ee5687 not found: ID does not exist" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.174932 4941 scope.go:117] "RemoveContainer" containerID="f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4" Mar 07 07:06:42 crc 
kubenswrapper[4941]: E0307 07:06:42.175874 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4\": container with ID starting with f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4 not found: ID does not exist" containerID="f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.175905 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4"} err="failed to get container status \"f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4\": rpc error: code = NotFound desc = could not find container \"f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4\": container with ID starting with f8016e7f2961e167a7ce764c9ea66e04a8c44d87de8d36effcdbcddd8fc61ed4 not found: ID does not exist" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.175921 4941 scope.go:117] "RemoveContainer" containerID="ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2" Mar 07 07:06:42 crc kubenswrapper[4941]: E0307 07:06:42.176366 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2\": container with ID starting with ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2 not found: ID does not exist" containerID="ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.176465 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2"} err="failed to get container status 
\"ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2\": rpc error: code = NotFound desc = could not find container \"ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2\": container with ID starting with ccacb76c3278693f9e692add29908163946696c6994ba4cb7587165f7daa1cb2 not found: ID does not exist" Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.753423 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" event={"ID":"9d22f709-5c4f-4809-8de1-515f401502fe","Type":"ContainerStarted","Data":"61097ad4c6aa19b7829ae8cf4ab1e6c47facbfd39f8fe0b7b08e7e09eb5d168b"} Mar 07 07:06:42 crc kubenswrapper[4941]: I0307 07:06:42.775308 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-fznph" podStartSLOduration=1.898584998 podStartE2EDuration="6.775289006s" podCreationTimestamp="2026-03-07 07:06:36 +0000 UTC" firstStartedPulling="2026-03-07 07:06:37.298508744 +0000 UTC m=+894.250874199" lastFinishedPulling="2026-03-07 07:06:42.175212752 +0000 UTC m=+899.127578207" observedRunningTime="2026-03-07 07:06:42.774320112 +0000 UTC m=+899.726685607" watchObservedRunningTime="2026-03-07 07:06:42.775289006 +0000 UTC m=+899.727654471" Mar 07 07:06:46 crc kubenswrapper[4941]: I0307 07:06:46.851147 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-b4gr5" Mar 07 07:06:47 crc kubenswrapper[4941]: I0307 07:06:47.237899 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:47 crc kubenswrapper[4941]: I0307 07:06:47.237990 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:47 crc kubenswrapper[4941]: I0307 07:06:47.245942 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:47 crc kubenswrapper[4941]: I0307 07:06:47.797538 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-789c7b5974-22gxt" Mar 07 07:06:47 crc kubenswrapper[4941]: I0307 07:06:47.870593 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nwzjs"] Mar 07 07:06:57 crc kubenswrapper[4941]: I0307 07:06:57.111705 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4w5nw" Mar 07 07:07:02 crc kubenswrapper[4941]: I0307 07:07:02.558360 4941 scope.go:117] "RemoveContainer" containerID="d3ee4a945633e6b0a3254728184548cc71eea0e37bf7eb74506c8b423cbb8379" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.658482 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j"] Mar 07 07:07:10 crc kubenswrapper[4941]: E0307 07:07:10.659738 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="registry-server" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.659759 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="registry-server" Mar 07 07:07:10 crc kubenswrapper[4941]: E0307 07:07:10.659782 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="extract-utilities" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.659793 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="extract-utilities" Mar 07 07:07:10 crc kubenswrapper[4941]: E0307 07:07:10.659818 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="extract-content" Mar 07 
07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.659831 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="extract-content" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.659980 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea12ce8c-0652-424c-8c82-88df7a7b71e8" containerName="registry-server" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.661262 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.663209 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.664379 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j"] Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.838050 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmh5s\" (UniqueName: \"kubernetes.io/projected/bd5d5308-944e-4c42-a452-049f94c4d06b-kube-api-access-dmh5s\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.838109 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 
crc kubenswrapper[4941]: I0307 07:07:10.838147 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.938991 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmh5s\" (UniqueName: \"kubernetes.io/projected/bd5d5308-944e-4c42-a452-049f94c4d06b-kube-api-access-dmh5s\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.939061 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.939098 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.939731 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.939739 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.965813 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmh5s\" (UniqueName: \"kubernetes.io/projected/bd5d5308-944e-4c42-a452-049f94c4d06b-kube-api-access-dmh5s\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:10 crc kubenswrapper[4941]: I0307 07:07:10.979598 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:11 crc kubenswrapper[4941]: I0307 07:07:11.409973 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j"] Mar 07 07:07:11 crc kubenswrapper[4941]: W0307 07:07:11.425895 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd5d5308_944e_4c42_a452_049f94c4d06b.slice/crio-5ccd868d0ec54bbc22e013bc3b35ccea5c2e9ed885483545836dc85025757f3c WatchSource:0}: Error finding container 5ccd868d0ec54bbc22e013bc3b35ccea5c2e9ed885483545836dc85025757f3c: Status 404 returned error can't find the container with id 5ccd868d0ec54bbc22e013bc3b35ccea5c2e9ed885483545836dc85025757f3c Mar 07 07:07:11 crc kubenswrapper[4941]: I0307 07:07:11.959176 4941 generic.go:334] "Generic (PLEG): container finished" podID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerID="da53bc3b556a4505dc899b5affaaa550c6f4283791ab4f7d2b42496adc703420" exitCode=0 Mar 07 07:07:11 crc kubenswrapper[4941]: I0307 07:07:11.964776 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:07:11 crc kubenswrapper[4941]: I0307 07:07:11.970010 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" event={"ID":"bd5d5308-944e-4c42-a452-049f94c4d06b","Type":"ContainerDied","Data":"da53bc3b556a4505dc899b5affaaa550c6f4283791ab4f7d2b42496adc703420"} Mar 07 07:07:11 crc kubenswrapper[4941]: I0307 07:07:11.970071 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" 
event={"ID":"bd5d5308-944e-4c42-a452-049f94c4d06b","Type":"ContainerStarted","Data":"5ccd868d0ec54bbc22e013bc3b35ccea5c2e9ed885483545836dc85025757f3c"} Mar 07 07:07:12 crc kubenswrapper[4941]: I0307 07:07:12.915974 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nwzjs" podUID="46da50cb-1038-4289-be6d-e5f3b4c70ab3" containerName="console" containerID="cri-o://65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec" gracePeriod=15 Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.314187 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nwzjs_46da50cb-1038-4289-be6d-e5f3b4c70ab3/console/0.log" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.314656 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.477380 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-serving-cert\") pod \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.477477 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-trusted-ca-bundle\") pod \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.477549 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-oauth-config\") pod \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\" (UID: 
\"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.477605 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vstnp\" (UniqueName: \"kubernetes.io/projected/46da50cb-1038-4289-be6d-e5f3b4c70ab3-kube-api-access-vstnp\") pod \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.477635 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-config\") pod \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.477816 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-service-ca\") pod \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.477869 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-oauth-serving-cert\") pod \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\" (UID: \"46da50cb-1038-4289-be6d-e5f3b4c70ab3\") " Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.479096 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "46da50cb-1038-4289-be6d-e5f3b4c70ab3" (UID: "46da50cb-1038-4289-be6d-e5f3b4c70ab3"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.479559 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "46da50cb-1038-4289-be6d-e5f3b4c70ab3" (UID: "46da50cb-1038-4289-be6d-e5f3b4c70ab3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.480329 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-config" (OuterVolumeSpecName: "console-config") pod "46da50cb-1038-4289-be6d-e5f3b4c70ab3" (UID: "46da50cb-1038-4289-be6d-e5f3b4c70ab3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.480994 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-service-ca" (OuterVolumeSpecName: "service-ca") pod "46da50cb-1038-4289-be6d-e5f3b4c70ab3" (UID: "46da50cb-1038-4289-be6d-e5f3b4c70ab3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.484968 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "46da50cb-1038-4289-be6d-e5f3b4c70ab3" (UID: "46da50cb-1038-4289-be6d-e5f3b4c70ab3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.533504 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "46da50cb-1038-4289-be6d-e5f3b4c70ab3" (UID: "46da50cb-1038-4289-be6d-e5f3b4c70ab3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.540634 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46da50cb-1038-4289-be6d-e5f3b4c70ab3-kube-api-access-vstnp" (OuterVolumeSpecName: "kube-api-access-vstnp") pod "46da50cb-1038-4289-be6d-e5f3b4c70ab3" (UID: "46da50cb-1038-4289-be6d-e5f3b4c70ab3"). InnerVolumeSpecName "kube-api-access-vstnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.579795 4941 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.579827 4941 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.579837 4941 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.579848 4941 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.579857 4941 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.579867 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vstnp\" (UniqueName: \"kubernetes.io/projected/46da50cb-1038-4289-be6d-e5f3b4c70ab3-kube-api-access-vstnp\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.579875 4941 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46da50cb-1038-4289-be6d-e5f3b4c70ab3-console-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.978204 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nwzjs_46da50cb-1038-4289-be6d-e5f3b4c70ab3/console/0.log" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.978313 4941 generic.go:334] "Generic (PLEG): container finished" podID="46da50cb-1038-4289-be6d-e5f3b4c70ab3" containerID="65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec" exitCode=2 Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.978514 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwzjs" event={"ID":"46da50cb-1038-4289-be6d-e5f3b4c70ab3","Type":"ContainerDied","Data":"65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec"} Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.978567 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwzjs" 
event={"ID":"46da50cb-1038-4289-be6d-e5f3b4c70ab3","Type":"ContainerDied","Data":"5dab62067ec5636ab980e7b59d12f6cb4858ac3410a31593e4ac1320d927873d"} Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.978571 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwzjs" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.978590 4941 scope.go:117] "RemoveContainer" containerID="65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec" Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.983869 4941 generic.go:334] "Generic (PLEG): container finished" podID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerID="85140cf491e52ffb894bd5f00ded02715e7579417af5f7a7e564ce48c9c53226" exitCode=0 Mar 07 07:07:13 crc kubenswrapper[4941]: I0307 07:07:13.983930 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" event={"ID":"bd5d5308-944e-4c42-a452-049f94c4d06b","Type":"ContainerDied","Data":"85140cf491e52ffb894bd5f00ded02715e7579417af5f7a7e564ce48c9c53226"} Mar 07 07:07:14 crc kubenswrapper[4941]: I0307 07:07:14.032042 4941 scope.go:117] "RemoveContainer" containerID="65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec" Mar 07 07:07:14 crc kubenswrapper[4941]: E0307 07:07:14.045714 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec\": container with ID starting with 65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec not found: ID does not exist" containerID="65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec" Mar 07 07:07:14 crc kubenswrapper[4941]: I0307 07:07:14.045820 4941 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec"} err="failed to get container status \"65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec\": rpc error: code = NotFound desc = could not find container \"65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec\": container with ID starting with 65671d5efd2e4ead55599ceefe91ff4b1d17a45a0827282d343bcab62af6dfec not found: ID does not exist" Mar 07 07:07:14 crc kubenswrapper[4941]: I0307 07:07:14.055139 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nwzjs"] Mar 07 07:07:14 crc kubenswrapper[4941]: I0307 07:07:14.062628 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nwzjs"] Mar 07 07:07:14 crc kubenswrapper[4941]: I0307 07:07:14.994794 4941 generic.go:334] "Generic (PLEG): container finished" podID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerID="71ab9906522b3d142e4d7f8cb1646a8bb1ef5baa6d7ef78127af4da65cccb507" exitCode=0 Mar 07 07:07:14 crc kubenswrapper[4941]: I0307 07:07:14.994871 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" event={"ID":"bd5d5308-944e-4c42-a452-049f94c4d06b","Type":"ContainerDied","Data":"71ab9906522b3d142e4d7f8cb1646a8bb1ef5baa6d7ef78127af4da65cccb507"} Mar 07 07:07:15 crc kubenswrapper[4941]: I0307 07:07:15.963109 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46da50cb-1038-4289-be6d-e5f3b4c70ab3" path="/var/lib/kubelet/pods/46da50cb-1038-4289-be6d-e5f3b4c70ab3/volumes" Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.232071 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.417059 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmh5s\" (UniqueName: \"kubernetes.io/projected/bd5d5308-944e-4c42-a452-049f94c4d06b-kube-api-access-dmh5s\") pod \"bd5d5308-944e-4c42-a452-049f94c4d06b\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.417140 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-util\") pod \"bd5d5308-944e-4c42-a452-049f94c4d06b\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.418319 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-bundle\") pod \"bd5d5308-944e-4c42-a452-049f94c4d06b\" (UID: \"bd5d5308-944e-4c42-a452-049f94c4d06b\") " Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.419051 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-bundle" (OuterVolumeSpecName: "bundle") pod "bd5d5308-944e-4c42-a452-049f94c4d06b" (UID: "bd5d5308-944e-4c42-a452-049f94c4d06b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.424328 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5d5308-944e-4c42-a452-049f94c4d06b-kube-api-access-dmh5s" (OuterVolumeSpecName: "kube-api-access-dmh5s") pod "bd5d5308-944e-4c42-a452-049f94c4d06b" (UID: "bd5d5308-944e-4c42-a452-049f94c4d06b"). InnerVolumeSpecName "kube-api-access-dmh5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.435963 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-util" (OuterVolumeSpecName: "util") pod "bd5d5308-944e-4c42-a452-049f94c4d06b" (UID: "bd5d5308-944e-4c42-a452-049f94c4d06b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.520179 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmh5s\" (UniqueName: \"kubernetes.io/projected/bd5d5308-944e-4c42-a452-049f94c4d06b-kube-api-access-dmh5s\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.520224 4941 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:16 crc kubenswrapper[4941]: I0307 07:07:16.520233 4941 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd5d5308-944e-4c42-a452-049f94c4d06b-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:17 crc kubenswrapper[4941]: I0307 07:07:17.028657 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" event={"ID":"bd5d5308-944e-4c42-a452-049f94c4d06b","Type":"ContainerDied","Data":"5ccd868d0ec54bbc22e013bc3b35ccea5c2e9ed885483545836dc85025757f3c"} Mar 07 07:07:17 crc kubenswrapper[4941]: I0307 07:07:17.028774 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ccd868d0ec54bbc22e013bc3b35ccea5c2e9ed885483545836dc85025757f3c" Mar 07 07:07:17 crc kubenswrapper[4941]: I0307 07:07:17.028952 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.061039 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6"] Mar 07 07:07:26 crc kubenswrapper[4941]: E0307 07:07:26.061942 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerName="util" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.061956 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerName="util" Mar 07 07:07:26 crc kubenswrapper[4941]: E0307 07:07:26.061973 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46da50cb-1038-4289-be6d-e5f3b4c70ab3" containerName="console" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.061980 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="46da50cb-1038-4289-be6d-e5f3b4c70ab3" containerName="console" Mar 07 07:07:26 crc kubenswrapper[4941]: E0307 07:07:26.061989 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerName="extract" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.061995 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerName="extract" Mar 07 07:07:26 crc kubenswrapper[4941]: E0307 07:07:26.062010 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerName="pull" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.062016 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5d5308-944e-4c42-a452-049f94c4d06b" containerName="pull" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.062098 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5d5308-944e-4c42-a452-049f94c4d06b" 
containerName="extract" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.062115 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="46da50cb-1038-4289-be6d-e5f3b4c70ab3" containerName="console" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.062558 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.064449 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fd5hr" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.064602 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.068244 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.068568 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.068862 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.084065 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6"] Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.245372 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdtxr\" (UniqueName: \"kubernetes.io/projected/40a1b026-57b0-4461-ab92-8e21f5ba9769-kube-api-access-zdtxr\") pod \"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " 
pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.245766 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40a1b026-57b0-4461-ab92-8e21f5ba9769-apiservice-cert\") pod \"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.245861 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40a1b026-57b0-4461-ab92-8e21f5ba9769-webhook-cert\") pod \"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.309909 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-75bb886-4qfz7"] Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.310786 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.313073 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.313359 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.314039 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-cssvr" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.333616 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-75bb886-4qfz7"] Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.361689 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40a1b026-57b0-4461-ab92-8e21f5ba9769-webhook-cert\") pod \"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.361792 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdtxr\" (UniqueName: \"kubernetes.io/projected/40a1b026-57b0-4461-ab92-8e21f5ba9769-kube-api-access-zdtxr\") pod \"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.361854 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40a1b026-57b0-4461-ab92-8e21f5ba9769-apiservice-cert\") pod 
\"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.378247 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40a1b026-57b0-4461-ab92-8e21f5ba9769-apiservice-cert\") pod \"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.407688 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40a1b026-57b0-4461-ab92-8e21f5ba9769-webhook-cert\") pod \"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.408020 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdtxr\" (UniqueName: \"kubernetes.io/projected/40a1b026-57b0-4461-ab92-8e21f5ba9769-kube-api-access-zdtxr\") pod \"metallb-operator-controller-manager-5fcd57bf5c-fvjx6\" (UID: \"40a1b026-57b0-4461-ab92-8e21f5ba9769\") " pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.462925 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zjqg\" (UniqueName: \"kubernetes.io/projected/a9be712c-d754-4d39-b871-4199924fa125-kube-api-access-4zjqg\") pod \"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 
07:07:26.462997 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9be712c-d754-4d39-b871-4199924fa125-apiservice-cert\") pod \"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.463022 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9be712c-d754-4d39-b871-4199924fa125-webhook-cert\") pod \"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.564535 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9be712c-d754-4d39-b871-4199924fa125-apiservice-cert\") pod \"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.564606 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9be712c-d754-4d39-b871-4199924fa125-webhook-cert\") pod \"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.564687 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjqg\" (UniqueName: \"kubernetes.io/projected/a9be712c-d754-4d39-b871-4199924fa125-kube-api-access-4zjqg\") pod 
\"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.575116 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9be712c-d754-4d39-b871-4199924fa125-apiservice-cert\") pod \"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.580883 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9be712c-d754-4d39-b871-4199924fa125-webhook-cert\") pod \"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.581604 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zjqg\" (UniqueName: \"kubernetes.io/projected/a9be712c-d754-4d39-b871-4199924fa125-kube-api-access-4zjqg\") pod \"metallb-operator-webhook-server-75bb886-4qfz7\" (UID: \"a9be712c-d754-4d39-b871-4199924fa125\") " pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.626276 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.678985 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:26 crc kubenswrapper[4941]: I0307 07:07:26.873363 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-75bb886-4qfz7"] Mar 07 07:07:27 crc kubenswrapper[4941]: I0307 07:07:27.094483 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" event={"ID":"a9be712c-d754-4d39-b871-4199924fa125","Type":"ContainerStarted","Data":"89e0cf0b664deb1dd55a290bfe2d6d15f78bae5da6895256c91c9bd6d4e901e9"} Mar 07 07:07:27 crc kubenswrapper[4941]: I0307 07:07:27.190352 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6"] Mar 07 07:07:27 crc kubenswrapper[4941]: W0307 07:07:27.199516 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40a1b026_57b0_4461_ab92_8e21f5ba9769.slice/crio-a5957ca4a033dba253a89093ae48c58d29cb7e27a3a130ad6a6feacf7671c70a WatchSource:0}: Error finding container a5957ca4a033dba253a89093ae48c58d29cb7e27a3a130ad6a6feacf7671c70a: Status 404 returned error can't find the container with id a5957ca4a033dba253a89093ae48c58d29cb7e27a3a130ad6a6feacf7671c70a Mar 07 07:07:28 crc kubenswrapper[4941]: I0307 07:07:28.104871 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" event={"ID":"40a1b026-57b0-4461-ab92-8e21f5ba9769","Type":"ContainerStarted","Data":"a5957ca4a033dba253a89093ae48c58d29cb7e27a3a130ad6a6feacf7671c70a"} Mar 07 07:07:33 crc kubenswrapper[4941]: I0307 07:07:33.155160 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" 
event={"ID":"a9be712c-d754-4d39-b871-4199924fa125","Type":"ContainerStarted","Data":"3a24031481b124945e6016b74592654030f0bb978dacbf3cd8dfd58fd577f13d"} Mar 07 07:07:33 crc kubenswrapper[4941]: I0307 07:07:33.157508 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:33 crc kubenswrapper[4941]: I0307 07:07:33.159856 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" event={"ID":"40a1b026-57b0-4461-ab92-8e21f5ba9769","Type":"ContainerStarted","Data":"4af4250ce730d60823fda2df557160add94fe385d976e09a39bd53ab0e9af169"} Mar 07 07:07:33 crc kubenswrapper[4941]: I0307 07:07:33.160194 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:07:33 crc kubenswrapper[4941]: I0307 07:07:33.179253 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" podStartSLOduration=1.605819598 podStartE2EDuration="7.179228175s" podCreationTimestamp="2026-03-07 07:07:26 +0000 UTC" firstStartedPulling="2026-03-07 07:07:26.89020615 +0000 UTC m=+943.842571615" lastFinishedPulling="2026-03-07 07:07:32.463614727 +0000 UTC m=+949.415980192" observedRunningTime="2026-03-07 07:07:33.17580658 +0000 UTC m=+950.128172045" watchObservedRunningTime="2026-03-07 07:07:33.179228175 +0000 UTC m=+950.131593640" Mar 07 07:07:33 crc kubenswrapper[4941]: I0307 07:07:33.203437 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" podStartSLOduration=1.9562693439999999 podStartE2EDuration="7.203397592s" podCreationTimestamp="2026-03-07 07:07:26 +0000 UTC" firstStartedPulling="2026-03-07 07:07:27.203023667 +0000 UTC m=+944.155389132" lastFinishedPulling="2026-03-07 
07:07:32.450151915 +0000 UTC m=+949.402517380" observedRunningTime="2026-03-07 07:07:33.198153873 +0000 UTC m=+950.150519358" watchObservedRunningTime="2026-03-07 07:07:33.203397592 +0000 UTC m=+950.155763057" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.168374 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7dlgn"] Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.169591 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.187474 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7dlgn"] Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.240065 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-utilities\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.240161 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-catalog-content\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.240224 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8x9m\" (UniqueName: \"kubernetes.io/projected/25f6a59b-3d52-4c22-b2c0-054f2c691934-kube-api-access-b8x9m\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 
07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.341757 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-catalog-content\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.342098 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8x9m\" (UniqueName: \"kubernetes.io/projected/25f6a59b-3d52-4c22-b2c0-054f2c691934-kube-api-access-b8x9m\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.342227 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-utilities\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.342368 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-catalog-content\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.342779 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-utilities\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 
07:07:35.367783 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8x9m\" (UniqueName: \"kubernetes.io/projected/25f6a59b-3d52-4c22-b2c0-054f2c691934-kube-api-access-b8x9m\") pod \"community-operators-7dlgn\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.490175 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:35 crc kubenswrapper[4941]: I0307 07:07:35.796691 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7dlgn"] Mar 07 07:07:36 crc kubenswrapper[4941]: I0307 07:07:36.177945 4941 generic.go:334] "Generic (PLEG): container finished" podID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerID="9be3363b324069f8c04dacd4008b738e1e328540fc7675b0bda5598043e5e783" exitCode=0 Mar 07 07:07:36 crc kubenswrapper[4941]: I0307 07:07:36.178027 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dlgn" event={"ID":"25f6a59b-3d52-4c22-b2c0-054f2c691934","Type":"ContainerDied","Data":"9be3363b324069f8c04dacd4008b738e1e328540fc7675b0bda5598043e5e783"} Mar 07 07:07:36 crc kubenswrapper[4941]: I0307 07:07:36.178475 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dlgn" event={"ID":"25f6a59b-3d52-4c22-b2c0-054f2c691934","Type":"ContainerStarted","Data":"526d11f48b59eadc9083590757d04064000b68632370a88f9a0b1e9e956477a4"} Mar 07 07:07:37 crc kubenswrapper[4941]: I0307 07:07:37.191724 4941 generic.go:334] "Generic (PLEG): container finished" podID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerID="60323b783e5d5d74f791ecd5fe32cc95f84aee9a588cdeb45ca6265cca400960" exitCode=0 Mar 07 07:07:37 crc kubenswrapper[4941]: I0307 07:07:37.191826 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7dlgn" event={"ID":"25f6a59b-3d52-4c22-b2c0-054f2c691934","Type":"ContainerDied","Data":"60323b783e5d5d74f791ecd5fe32cc95f84aee9a588cdeb45ca6265cca400960"} Mar 07 07:07:38 crc kubenswrapper[4941]: I0307 07:07:38.201936 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dlgn" event={"ID":"25f6a59b-3d52-4c22-b2c0-054f2c691934","Type":"ContainerStarted","Data":"dbec95b3f3c98960bbc339af416339913f7f5612b3fde222233ef5b246edc1ae"} Mar 07 07:07:40 crc kubenswrapper[4941]: I0307 07:07:40.314692 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:07:40 crc kubenswrapper[4941]: I0307 07:07:40.314774 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:07:45 crc kubenswrapper[4941]: I0307 07:07:45.490644 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:45 crc kubenswrapper[4941]: I0307 07:07:45.492176 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:45 crc kubenswrapper[4941]: I0307 07:07:45.542433 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:45 crc kubenswrapper[4941]: I0307 07:07:45.559337 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-7dlgn" podStartSLOduration=9.100119071 podStartE2EDuration="10.559317257s" podCreationTimestamp="2026-03-07 07:07:35 +0000 UTC" firstStartedPulling="2026-03-07 07:07:36.179424328 +0000 UTC m=+953.131789803" lastFinishedPulling="2026-03-07 07:07:37.638622504 +0000 UTC m=+954.590987989" observedRunningTime="2026-03-07 07:07:38.218488119 +0000 UTC m=+955.170853584" watchObservedRunningTime="2026-03-07 07:07:45.559317257 +0000 UTC m=+962.511682722" Mar 07 07:07:46 crc kubenswrapper[4941]: I0307 07:07:46.310229 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:46 crc kubenswrapper[4941]: I0307 07:07:46.633925 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-75bb886-4qfz7" Mar 07 07:07:48 crc kubenswrapper[4941]: I0307 07:07:48.545562 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7dlgn"] Mar 07 07:07:48 crc kubenswrapper[4941]: I0307 07:07:48.546640 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7dlgn" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerName="registry-server" containerID="cri-o://dbec95b3f3c98960bbc339af416339913f7f5612b3fde222233ef5b246edc1ae" gracePeriod=2 Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.280997 4941 generic.go:334] "Generic (PLEG): container finished" podID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerID="dbec95b3f3c98960bbc339af416339913f7f5612b3fde222233ef5b246edc1ae" exitCode=0 Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.281063 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dlgn" 
event={"ID":"25f6a59b-3d52-4c22-b2c0-054f2c691934","Type":"ContainerDied","Data":"dbec95b3f3c98960bbc339af416339913f7f5612b3fde222233ef5b246edc1ae"} Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.435954 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.539559 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-catalog-content\") pod \"25f6a59b-3d52-4c22-b2c0-054f2c691934\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.539647 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8x9m\" (UniqueName: \"kubernetes.io/projected/25f6a59b-3d52-4c22-b2c0-054f2c691934-kube-api-access-b8x9m\") pod \"25f6a59b-3d52-4c22-b2c0-054f2c691934\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.539710 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-utilities\") pod \"25f6a59b-3d52-4c22-b2c0-054f2c691934\" (UID: \"25f6a59b-3d52-4c22-b2c0-054f2c691934\") " Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.540911 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-utilities" (OuterVolumeSpecName: "utilities") pod "25f6a59b-3d52-4c22-b2c0-054f2c691934" (UID: "25f6a59b-3d52-4c22-b2c0-054f2c691934"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.548471 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f6a59b-3d52-4c22-b2c0-054f2c691934-kube-api-access-b8x9m" (OuterVolumeSpecName: "kube-api-access-b8x9m") pod "25f6a59b-3d52-4c22-b2c0-054f2c691934" (UID: "25f6a59b-3d52-4c22-b2c0-054f2c691934"). InnerVolumeSpecName "kube-api-access-b8x9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.588844 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25f6a59b-3d52-4c22-b2c0-054f2c691934" (UID: "25f6a59b-3d52-4c22-b2c0-054f2c691934"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.641759 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.641797 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f6a59b-3d52-4c22-b2c0-054f2c691934-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:49 crc kubenswrapper[4941]: I0307 07:07:49.641808 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8x9m\" (UniqueName: \"kubernetes.io/projected/25f6a59b-3d52-4c22-b2c0-054f2c691934-kube-api-access-b8x9m\") on node \"crc\" DevicePath \"\"" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.157382 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-njwjw"] Mar 07 07:07:50 crc kubenswrapper[4941]: E0307 
07:07:50.157601 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerName="extract-content" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.157613 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerName="extract-content" Mar 07 07:07:50 crc kubenswrapper[4941]: E0307 07:07:50.157622 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerName="registry-server" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.157629 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerName="registry-server" Mar 07 07:07:50 crc kubenswrapper[4941]: E0307 07:07:50.157652 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerName="extract-utilities" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.157666 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerName="extract-utilities" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.157809 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" containerName="registry-server" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.159017 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.171293 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-njwjw"] Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.290005 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dlgn" event={"ID":"25f6a59b-3d52-4c22-b2c0-054f2c691934","Type":"ContainerDied","Data":"526d11f48b59eadc9083590757d04064000b68632370a88f9a0b1e9e956477a4"} Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.290064 4941 scope.go:117] "RemoveContainer" containerID="dbec95b3f3c98960bbc339af416339913f7f5612b3fde222233ef5b246edc1ae" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.290084 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7dlgn" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.308306 4941 scope.go:117] "RemoveContainer" containerID="60323b783e5d5d74f791ecd5fe32cc95f84aee9a588cdeb45ca6265cca400960" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.314391 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7dlgn"] Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.324784 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7dlgn"] Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.328065 4941 scope.go:117] "RemoveContainer" containerID="9be3363b324069f8c04dacd4008b738e1e328540fc7675b0bda5598043e5e783" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.349723 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-utilities\") pod \"certified-operators-njwjw\" (UID: 
\"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.349787 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-catalog-content\") pod \"certified-operators-njwjw\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.349879 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng4mg\" (UniqueName: \"kubernetes.io/projected/da268ef7-cdee-4536-9504-1381de4ea636-kube-api-access-ng4mg\") pod \"certified-operators-njwjw\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.451521 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng4mg\" (UniqueName: \"kubernetes.io/projected/da268ef7-cdee-4536-9504-1381de4ea636-kube-api-access-ng4mg\") pod \"certified-operators-njwjw\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.451581 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-utilities\") pod \"certified-operators-njwjw\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.451612 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-catalog-content\") pod 
\"certified-operators-njwjw\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.452121 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-catalog-content\") pod \"certified-operators-njwjw\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.452291 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-utilities\") pod \"certified-operators-njwjw\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.471604 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng4mg\" (UniqueName: \"kubernetes.io/projected/da268ef7-cdee-4536-9504-1381de4ea636-kube-api-access-ng4mg\") pod \"certified-operators-njwjw\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.477564 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:07:50 crc kubenswrapper[4941]: I0307 07:07:50.695100 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-njwjw"] Mar 07 07:07:51 crc kubenswrapper[4941]: I0307 07:07:51.297794 4941 generic.go:334] "Generic (PLEG): container finished" podID="da268ef7-cdee-4536-9504-1381de4ea636" containerID="e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5" exitCode=0 Mar 07 07:07:51 crc kubenswrapper[4941]: I0307 07:07:51.297851 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njwjw" event={"ID":"da268ef7-cdee-4536-9504-1381de4ea636","Type":"ContainerDied","Data":"e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5"} Mar 07 07:07:51 crc kubenswrapper[4941]: I0307 07:07:51.298300 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njwjw" event={"ID":"da268ef7-cdee-4536-9504-1381de4ea636","Type":"ContainerStarted","Data":"b06ca5af17b0f524a44add003a6fba5c3f7de2e653d30d4c88cc07f69b2d61a3"} Mar 07 07:07:51 crc kubenswrapper[4941]: I0307 07:07:51.963931 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f6a59b-3d52-4c22-b2c0-054f2c691934" path="/var/lib/kubelet/pods/25f6a59b-3d52-4c22-b2c0-054f2c691934/volumes" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.164211 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c4k2z"] Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.165511 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.177463 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4k2z"] Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.277901 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-catalog-content\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.278029 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wldn\" (UniqueName: \"kubernetes.io/projected/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-kube-api-access-4wldn\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.278388 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-utilities\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.379926 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wldn\" (UniqueName: \"kubernetes.io/projected/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-kube-api-access-4wldn\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.380035 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-utilities\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.380073 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-catalog-content\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.380498 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-utilities\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.380542 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-catalog-content\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.410422 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wldn\" (UniqueName: \"kubernetes.io/projected/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-kube-api-access-4wldn\") pod \"redhat-marketplace-c4k2z\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.486989 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:07:52 crc kubenswrapper[4941]: I0307 07:07:52.907132 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4k2z"] Mar 07 07:07:53 crc kubenswrapper[4941]: I0307 07:07:53.313920 4941 generic.go:334] "Generic (PLEG): container finished" podID="da268ef7-cdee-4536-9504-1381de4ea636" containerID="3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c" exitCode=0 Mar 07 07:07:53 crc kubenswrapper[4941]: I0307 07:07:53.314031 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njwjw" event={"ID":"da268ef7-cdee-4536-9504-1381de4ea636","Type":"ContainerDied","Data":"3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c"} Mar 07 07:07:53 crc kubenswrapper[4941]: I0307 07:07:53.316655 4941 generic.go:334] "Generic (PLEG): container finished" podID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerID="33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0" exitCode=0 Mar 07 07:07:53 crc kubenswrapper[4941]: I0307 07:07:53.316693 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4k2z" event={"ID":"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa","Type":"ContainerDied","Data":"33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0"} Mar 07 07:07:53 crc kubenswrapper[4941]: I0307 07:07:53.316717 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4k2z" event={"ID":"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa","Type":"ContainerStarted","Data":"be2957c64af3b282c63e6574939e2e36c2be73d8c9d82ce736bf5af79b47731c"} Mar 07 07:07:54 crc kubenswrapper[4941]: I0307 07:07:54.326447 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njwjw" 
event={"ID":"da268ef7-cdee-4536-9504-1381de4ea636","Type":"ContainerStarted","Data":"84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87"} Mar 07 07:07:54 crc kubenswrapper[4941]: I0307 07:07:54.330369 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4k2z" event={"ID":"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa","Type":"ContainerStarted","Data":"4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72"} Mar 07 07:07:54 crc kubenswrapper[4941]: I0307 07:07:54.348058 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-njwjw" podStartSLOduration=1.884666921 podStartE2EDuration="4.348035022s" podCreationTimestamp="2026-03-07 07:07:50 +0000 UTC" firstStartedPulling="2026-03-07 07:07:51.299740351 +0000 UTC m=+968.252105816" lastFinishedPulling="2026-03-07 07:07:53.763108452 +0000 UTC m=+970.715473917" observedRunningTime="2026-03-07 07:07:54.34717124 +0000 UTC m=+971.299536715" watchObservedRunningTime="2026-03-07 07:07:54.348035022 +0000 UTC m=+971.300400497" Mar 07 07:07:55 crc kubenswrapper[4941]: I0307 07:07:55.337516 4941 generic.go:334] "Generic (PLEG): container finished" podID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerID="4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72" exitCode=0 Mar 07 07:07:55 crc kubenswrapper[4941]: I0307 07:07:55.339690 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4k2z" event={"ID":"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa","Type":"ContainerDied","Data":"4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72"} Mar 07 07:07:56 crc kubenswrapper[4941]: I0307 07:07:56.350726 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4k2z" 
event={"ID":"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa","Type":"ContainerStarted","Data":"f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27"} Mar 07 07:07:56 crc kubenswrapper[4941]: I0307 07:07:56.373826 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c4k2z" podStartSLOduration=1.920667796 podStartE2EDuration="4.373799615s" podCreationTimestamp="2026-03-07 07:07:52 +0000 UTC" firstStartedPulling="2026-03-07 07:07:53.319513535 +0000 UTC m=+970.271879040" lastFinishedPulling="2026-03-07 07:07:55.772645394 +0000 UTC m=+972.725010859" observedRunningTime="2026-03-07 07:07:56.373233431 +0000 UTC m=+973.325598896" watchObservedRunningTime="2026-03-07 07:07:56.373799615 +0000 UTC m=+973.326165090" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.136779 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547788-8s6c7"] Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.137798 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547788-8s6c7" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.141776 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.141970 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.149746 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.173829 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547788-8s6c7"] Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.287146 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzpgq\" (UniqueName: \"kubernetes.io/projected/6f408644-d7ae-43ad-a056-fbe07aca78c1-kube-api-access-zzpgq\") pod \"auto-csr-approver-29547788-8s6c7\" (UID: \"6f408644-d7ae-43ad-a056-fbe07aca78c1\") " pod="openshift-infra/auto-csr-approver-29547788-8s6c7" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.388516 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzpgq\" (UniqueName: \"kubernetes.io/projected/6f408644-d7ae-43ad-a056-fbe07aca78c1-kube-api-access-zzpgq\") pod \"auto-csr-approver-29547788-8s6c7\" (UID: \"6f408644-d7ae-43ad-a056-fbe07aca78c1\") " pod="openshift-infra/auto-csr-approver-29547788-8s6c7" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.413430 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzpgq\" (UniqueName: \"kubernetes.io/projected/6f408644-d7ae-43ad-a056-fbe07aca78c1-kube-api-access-zzpgq\") pod \"auto-csr-approver-29547788-8s6c7\" (UID: \"6f408644-d7ae-43ad-a056-fbe07aca78c1\") " 
pod="openshift-infra/auto-csr-approver-29547788-8s6c7" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.463152 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547788-8s6c7" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.477733 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.477797 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.596143 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:08:00 crc kubenswrapper[4941]: I0307 07:08:00.701980 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547788-8s6c7"] Mar 07 07:08:01 crc kubenswrapper[4941]: I0307 07:08:01.385841 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547788-8s6c7" event={"ID":"6f408644-d7ae-43ad-a056-fbe07aca78c1","Type":"ContainerStarted","Data":"8289227ffbac8eccf127760c5b207fc28d1ee3ed929c49b3b6e47d5ec500bddb"} Mar 07 07:08:01 crc kubenswrapper[4941]: I0307 07:08:01.442375 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:08:02 crc kubenswrapper[4941]: I0307 07:08:02.395288 4941 generic.go:334] "Generic (PLEG): container finished" podID="6f408644-d7ae-43ad-a056-fbe07aca78c1" containerID="bb79d1d1306c286d1e351ed47763214d34ab77347dd00544da397bc6226f5eca" exitCode=0 Mar 07 07:08:02 crc kubenswrapper[4941]: I0307 07:08:02.395386 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547788-8s6c7" 
event={"ID":"6f408644-d7ae-43ad-a056-fbe07aca78c1","Type":"ContainerDied","Data":"bb79d1d1306c286d1e351ed47763214d34ab77347dd00544da397bc6226f5eca"} Mar 07 07:08:02 crc kubenswrapper[4941]: I0307 07:08:02.487703 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:08:02 crc kubenswrapper[4941]: I0307 07:08:02.488040 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:08:02 crc kubenswrapper[4941]: I0307 07:08:02.555298 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:08:02 crc kubenswrapper[4941]: I0307 07:08:02.950759 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-njwjw"] Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.402528 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-njwjw" podUID="da268ef7-cdee-4536-9504-1381de4ea636" containerName="registry-server" containerID="cri-o://84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87" gracePeriod=2 Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.459311 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.754587 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547788-8s6c7" Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.840943 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzpgq\" (UniqueName: \"kubernetes.io/projected/6f408644-d7ae-43ad-a056-fbe07aca78c1-kube-api-access-zzpgq\") pod \"6f408644-d7ae-43ad-a056-fbe07aca78c1\" (UID: \"6f408644-d7ae-43ad-a056-fbe07aca78c1\") " Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.848554 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f408644-d7ae-43ad-a056-fbe07aca78c1-kube-api-access-zzpgq" (OuterVolumeSpecName: "kube-api-access-zzpgq") pod "6f408644-d7ae-43ad-a056-fbe07aca78c1" (UID: "6f408644-d7ae-43ad-a056-fbe07aca78c1"). InnerVolumeSpecName "kube-api-access-zzpgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.853474 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.942900 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng4mg\" (UniqueName: \"kubernetes.io/projected/da268ef7-cdee-4536-9504-1381de4ea636-kube-api-access-ng4mg\") pod \"da268ef7-cdee-4536-9504-1381de4ea636\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.942998 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-utilities\") pod \"da268ef7-cdee-4536-9504-1381de4ea636\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.943080 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-catalog-content\") pod \"da268ef7-cdee-4536-9504-1381de4ea636\" (UID: \"da268ef7-cdee-4536-9504-1381de4ea636\") " Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.943367 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzpgq\" (UniqueName: \"kubernetes.io/projected/6f408644-d7ae-43ad-a056-fbe07aca78c1-kube-api-access-zzpgq\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.944273 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-utilities" (OuterVolumeSpecName: "utilities") pod "da268ef7-cdee-4536-9504-1381de4ea636" (UID: "da268ef7-cdee-4536-9504-1381de4ea636"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:08:03 crc kubenswrapper[4941]: I0307 07:08:03.948068 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da268ef7-cdee-4536-9504-1381de4ea636-kube-api-access-ng4mg" (OuterVolumeSpecName: "kube-api-access-ng4mg") pod "da268ef7-cdee-4536-9504-1381de4ea636" (UID: "da268ef7-cdee-4536-9504-1381de4ea636"). InnerVolumeSpecName "kube-api-access-ng4mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.044846 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng4mg\" (UniqueName: \"kubernetes.io/projected/da268ef7-cdee-4536-9504-1381de4ea636-kube-api-access-ng4mg\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.044883 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.409449 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547788-8s6c7" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.409646 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547788-8s6c7" event={"ID":"6f408644-d7ae-43ad-a056-fbe07aca78c1","Type":"ContainerDied","Data":"8289227ffbac8eccf127760c5b207fc28d1ee3ed929c49b3b6e47d5ec500bddb"} Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.409708 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8289227ffbac8eccf127760c5b207fc28d1ee3ed929c49b3b6e47d5ec500bddb" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.414511 4941 generic.go:334] "Generic (PLEG): container finished" podID="da268ef7-cdee-4536-9504-1381de4ea636" containerID="84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87" exitCode=0 Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.414616 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-njwjw" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.414615 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njwjw" event={"ID":"da268ef7-cdee-4536-9504-1381de4ea636","Type":"ContainerDied","Data":"84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87"} Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.414705 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njwjw" event={"ID":"da268ef7-cdee-4536-9504-1381de4ea636","Type":"ContainerDied","Data":"b06ca5af17b0f524a44add003a6fba5c3f7de2e653d30d4c88cc07f69b2d61a3"} Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.414740 4941 scope.go:117] "RemoveContainer" containerID="84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.441938 4941 scope.go:117] "RemoveContainer" 
containerID="3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.469747 4941 scope.go:117] "RemoveContainer" containerID="e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.491977 4941 scope.go:117] "RemoveContainer" containerID="84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87" Mar 07 07:08:04 crc kubenswrapper[4941]: E0307 07:08:04.495162 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87\": container with ID starting with 84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87 not found: ID does not exist" containerID="84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.495338 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87"} err="failed to get container status \"84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87\": rpc error: code = NotFound desc = could not find container \"84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87\": container with ID starting with 84cda645bb182dc19496a81a1570b93c5fada5ef5ccb1d845b44a893bde54c87 not found: ID does not exist" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.495504 4941 scope.go:117] "RemoveContainer" containerID="3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c" Mar 07 07:08:04 crc kubenswrapper[4941]: E0307 07:08:04.495903 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c\": container with ID starting with 
3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c not found: ID does not exist" containerID="3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.495946 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c"} err="failed to get container status \"3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c\": rpc error: code = NotFound desc = could not find container \"3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c\": container with ID starting with 3ce74c44302a799d519feabbea40dfe2796f663525836cdcff06f9ff5801f72c not found: ID does not exist" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.495976 4941 scope.go:117] "RemoveContainer" containerID="e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5" Mar 07 07:08:04 crc kubenswrapper[4941]: E0307 07:08:04.496263 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5\": container with ID starting with e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5 not found: ID does not exist" containerID="e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.496297 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5"} err="failed to get container status \"e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5\": rpc error: code = NotFound desc = could not find container \"e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5\": container with ID starting with e2b9e7ca06ffe1c81b265206c8eac0b4e72311449d08eab88dfd738a616536a5 not found: ID does not 
exist" Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.842329 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547782-kqxht"] Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.851394 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547782-kqxht"] Mar 07 07:08:04 crc kubenswrapper[4941]: I0307 07:08:04.964633 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da268ef7-cdee-4536-9504-1381de4ea636" (UID: "da268ef7-cdee-4536-9504-1381de4ea636"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.049430 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-njwjw"] Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.058451 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da268ef7-cdee-4536-9504-1381de4ea636-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.061045 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-njwjw"] Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.351322 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4k2z"] Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.423159 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c4k2z" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerName="registry-server" containerID="cri-o://f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27" gracePeriod=2 Mar 07 07:08:05 crc kubenswrapper[4941]: 
I0307 07:08:05.866902 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.961975 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ce6a206-b2ea-41bd-84f7-c7f1007b321c" path="/var/lib/kubelet/pods/8ce6a206-b2ea-41bd-84f7-c7f1007b321c/volumes" Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.962935 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da268ef7-cdee-4536-9504-1381de4ea636" path="/var/lib/kubelet/pods/da268ef7-cdee-4536-9504-1381de4ea636/volumes" Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.970679 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wldn\" (UniqueName: \"kubernetes.io/projected/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-kube-api-access-4wldn\") pod \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.970757 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-utilities\") pod \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.970791 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-catalog-content\") pod \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\" (UID: \"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa\") " Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.971529 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-utilities" (OuterVolumeSpecName: "utilities") pod 
"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" (UID: "556f1f3c-e67b-4c55-a7e4-46ee9afcebfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.976918 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-kube-api-access-4wldn" (OuterVolumeSpecName: "kube-api-access-4wldn") pod "556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" (UID: "556f1f3c-e67b-4c55-a7e4-46ee9afcebfa"). InnerVolumeSpecName "kube-api-access-4wldn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:08:05 crc kubenswrapper[4941]: I0307 07:08:05.998192 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" (UID: "556f1f3c-e67b-4c55-a7e4-46ee9afcebfa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.072722 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wldn\" (UniqueName: \"kubernetes.io/projected/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-kube-api-access-4wldn\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.072776 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.072785 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.432146 4941 generic.go:334] "Generic (PLEG): container finished" podID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerID="f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27" exitCode=0 Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.432195 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4k2z" event={"ID":"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa","Type":"ContainerDied","Data":"f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27"} Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.432226 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4k2z" event={"ID":"556f1f3c-e67b-4c55-a7e4-46ee9afcebfa","Type":"ContainerDied","Data":"be2957c64af3b282c63e6574939e2e36c2be73d8c9d82ce736bf5af79b47731c"} Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.432244 4941 scope.go:117] "RemoveContainer" containerID="f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 
07:08:06.432267 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4k2z" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.458449 4941 scope.go:117] "RemoveContainer" containerID="4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.488629 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4k2z"] Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.499113 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4k2z"] Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.500733 4941 scope.go:117] "RemoveContainer" containerID="33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.529123 4941 scope.go:117] "RemoveContainer" containerID="f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27" Mar 07 07:08:06 crc kubenswrapper[4941]: E0307 07:08:06.529835 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27\": container with ID starting with f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27 not found: ID does not exist" containerID="f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.529886 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27"} err="failed to get container status \"f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27\": rpc error: code = NotFound desc = could not find container \"f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27\": container with ID starting with 
f03db4ee702ab1bd8a58c01b5c2a51b716c39b0a28c3ba145cc044b667190b27 not found: ID does not exist" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.529920 4941 scope.go:117] "RemoveContainer" containerID="4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72" Mar 07 07:08:06 crc kubenswrapper[4941]: E0307 07:08:06.530561 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72\": container with ID starting with 4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72 not found: ID does not exist" containerID="4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.530607 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72"} err="failed to get container status \"4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72\": rpc error: code = NotFound desc = could not find container \"4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72\": container with ID starting with 4409b99776b04062680f073a76c39fef76cccc2fd2681f55ded01c4338978a72 not found: ID does not exist" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.530629 4941 scope.go:117] "RemoveContainer" containerID="33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0" Mar 07 07:08:06 crc kubenswrapper[4941]: E0307 07:08:06.531021 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0\": container with ID starting with 33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0 not found: ID does not exist" containerID="33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0" Mar 07 07:08:06 crc 
kubenswrapper[4941]: I0307 07:08:06.531075 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0"} err="failed to get container status \"33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0\": rpc error: code = NotFound desc = could not find container \"33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0\": container with ID starting with 33485b3fb65d24741e450ac751008cfcafa1c19630dc5f11d4d387da429573a0 not found: ID does not exist" Mar 07 07:08:06 crc kubenswrapper[4941]: I0307 07:08:06.683014 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5fcd57bf5c-fvjx6" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.428710 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-gdcv9"] Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.428961 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da268ef7-cdee-4536-9504-1381de4ea636" containerName="registry-server" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.428977 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="da268ef7-cdee-4536-9504-1381de4ea636" containerName="registry-server" Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.428991 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da268ef7-cdee-4536-9504-1381de4ea636" containerName="extract-utilities" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.428999 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="da268ef7-cdee-4536-9504-1381de4ea636" containerName="extract-utilities" Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.429014 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerName="extract-utilities" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 
07:08:07.429022 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerName="extract-utilities" Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.429030 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerName="registry-server" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.429037 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerName="registry-server" Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.429057 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f408644-d7ae-43ad-a056-fbe07aca78c1" containerName="oc" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.429064 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f408644-d7ae-43ad-a056-fbe07aca78c1" containerName="oc" Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.429077 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da268ef7-cdee-4536-9504-1381de4ea636" containerName="extract-content" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.429084 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="da268ef7-cdee-4536-9504-1381de4ea636" containerName="extract-content" Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.429097 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerName="extract-content" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.429104 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerName="extract-content" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.429222 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f408644-d7ae-43ad-a056-fbe07aca78c1" containerName="oc" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.429240 4941 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="da268ef7-cdee-4536-9504-1381de4ea636" containerName="registry-server" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.429253 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" containerName="registry-server" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.431764 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.433750 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.434032 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qt6gz" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.437929 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.453156 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb"] Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.454138 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.456662 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.470292 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb"] Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.536617 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-22x9t"] Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.537623 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.539556 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.539958 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vsrvg" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.544737 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.545437 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-7xzpv"] Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.546526 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.547801 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.548459 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.554885 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-7xzpv"] Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.593977 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-metrics\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.595010 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gm7m\" (UniqueName: \"kubernetes.io/projected/d571499c-14eb-495f-930d-9dafb0a3a093-kube-api-access-2gm7m\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.595041 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-frr-sockets\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.595063 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-frr-conf\") pod 
\"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.595076 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/477add90-db8d-449b-bc6f-45618a7e89f6-cert\") pod \"frr-k8s-webhook-server-7f989f654f-4f2fb\" (UID: \"477add90-db8d-449b-bc6f-45618a7e89f6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.595094 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d571499c-14eb-495f-930d-9dafb0a3a093-frr-startup\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.595110 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d571499c-14eb-495f-930d-9dafb0a3a093-metrics-certs\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.595130 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxw2\" (UniqueName: \"kubernetes.io/projected/477add90-db8d-449b-bc6f-45618a7e89f6-kube-api-access-ggxw2\") pod \"frr-k8s-webhook-server-7f989f654f-4f2fb\" (UID: \"477add90-db8d-449b-bc6f-45618a7e89f6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.595186 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-reloader\") pod 
\"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.696648 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-frr-conf\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.696705 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/477add90-db8d-449b-bc6f-45618a7e89f6-cert\") pod \"frr-k8s-webhook-server-7f989f654f-4f2fb\" (UID: \"477add90-db8d-449b-bc6f-45618a7e89f6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.696740 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d571499c-14eb-495f-930d-9dafb0a3a093-frr-startup\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.696770 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d571499c-14eb-495f-930d-9dafb0a3a093-metrics-certs\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.696801 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metallb-excludel2\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: 
I0307 07:08:07.696829 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bade6d7-3a37-4a66-b777-acddd50efb79-metrics-certs\") pod \"controller-86ddb6bd46-7xzpv\" (UID: \"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.696865 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxw2\" (UniqueName: \"kubernetes.io/projected/477add90-db8d-449b-bc6f-45618a7e89f6-kube-api-access-ggxw2\") pod \"frr-k8s-webhook-server-7f989f654f-4f2fb\" (UID: \"477add90-db8d-449b-bc6f-45618a7e89f6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.696921 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metrics-certs\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.696973 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-memberlist\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697004 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-reloader\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697032 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-metrics\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697055 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bade6d7-3a37-4a66-b777-acddd50efb79-cert\") pod \"controller-86ddb6bd46-7xzpv\" (UID: \"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697081 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6g6\" (UniqueName: \"kubernetes.io/projected/7bade6d7-3a37-4a66-b777-acddd50efb79-kube-api-access-dd6g6\") pod \"controller-86ddb6bd46-7xzpv\" (UID: \"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697105 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhrv2\" (UniqueName: \"kubernetes.io/projected/49a7425c-52fd-48b6-a2de-53dc8ab8c531-kube-api-access-fhrv2\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697134 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gm7m\" (UniqueName: \"kubernetes.io/projected/d571499c-14eb-495f-930d-9dafb0a3a093-kube-api-access-2gm7m\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697163 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-frr-sockets\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.697575 4941 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.697661 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d571499c-14eb-495f-930d-9dafb0a3a093-metrics-certs podName:d571499c-14eb-495f-930d-9dafb0a3a093 nodeName:}" failed. No retries permitted until 2026-03-07 07:08:08.197638986 +0000 UTC m=+985.150004551 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d571499c-14eb-495f-930d-9dafb0a3a093-metrics-certs") pod "frr-k8s-gdcv9" (UID: "d571499c-14eb-495f-930d-9dafb0a3a093") : secret "frr-k8s-certs-secret" not found Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697717 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-frr-sockets\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.697827 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-reloader\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.698084 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-metrics\") pod \"frr-k8s-gdcv9\" (UID: 
\"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.698368 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d571499c-14eb-495f-930d-9dafb0a3a093-frr-conf\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.698980 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d571499c-14eb-495f-930d-9dafb0a3a093-frr-startup\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.706236 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/477add90-db8d-449b-bc6f-45618a7e89f6-cert\") pod \"frr-k8s-webhook-server-7f989f654f-4f2fb\" (UID: \"477add90-db8d-449b-bc6f-45618a7e89f6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.716424 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxw2\" (UniqueName: \"kubernetes.io/projected/477add90-db8d-449b-bc6f-45618a7e89f6-kube-api-access-ggxw2\") pod \"frr-k8s-webhook-server-7f989f654f-4f2fb\" (UID: \"477add90-db8d-449b-bc6f-45618a7e89f6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.717028 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gm7m\" (UniqueName: \"kubernetes.io/projected/d571499c-14eb-495f-930d-9dafb0a3a093-kube-api-access-2gm7m\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 
07:08:07.769535 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.798154 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bade6d7-3a37-4a66-b777-acddd50efb79-cert\") pod \"controller-86ddb6bd46-7xzpv\" (UID: \"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.798220 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6g6\" (UniqueName: \"kubernetes.io/projected/7bade6d7-3a37-4a66-b777-acddd50efb79-kube-api-access-dd6g6\") pod \"controller-86ddb6bd46-7xzpv\" (UID: \"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.798249 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhrv2\" (UniqueName: \"kubernetes.io/projected/49a7425c-52fd-48b6-a2de-53dc8ab8c531-kube-api-access-fhrv2\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.798320 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metallb-excludel2\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.798343 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bade6d7-3a37-4a66-b777-acddd50efb79-metrics-certs\") pod \"controller-86ddb6bd46-7xzpv\" (UID: 
\"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.798379 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metrics-certs\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.798443 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-memberlist\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.798750 4941 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.798832 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-memberlist podName:49a7425c-52fd-48b6-a2de-53dc8ab8c531 nodeName:}" failed. No retries permitted until 2026-03-07 07:08:08.298813875 +0000 UTC m=+985.251179340 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-memberlist") pod "speaker-22x9t" (UID: "49a7425c-52fd-48b6-a2de-53dc8ab8c531") : secret "metallb-memberlist" not found Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.798767 4941 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 07 07:08:07 crc kubenswrapper[4941]: E0307 07:08:07.799025 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metrics-certs podName:49a7425c-52fd-48b6-a2de-53dc8ab8c531 nodeName:}" failed. No retries permitted until 2026-03-07 07:08:08.29899948 +0000 UTC m=+985.251364945 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metrics-certs") pod "speaker-22x9t" (UID: "49a7425c-52fd-48b6-a2de-53dc8ab8c531") : secret "speaker-certs-secret" not found Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.799655 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metallb-excludel2\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.800515 4941 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.803260 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bade6d7-3a37-4a66-b777-acddd50efb79-metrics-certs\") pod \"controller-86ddb6bd46-7xzpv\" (UID: \"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc 
kubenswrapper[4941]: I0307 07:08:07.815705 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bade6d7-3a37-4a66-b777-acddd50efb79-cert\") pod \"controller-86ddb6bd46-7xzpv\" (UID: \"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.815735 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhrv2\" (UniqueName: \"kubernetes.io/projected/49a7425c-52fd-48b6-a2de-53dc8ab8c531-kube-api-access-fhrv2\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.817795 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6g6\" (UniqueName: \"kubernetes.io/projected/7bade6d7-3a37-4a66-b777-acddd50efb79-kube-api-access-dd6g6\") pod \"controller-86ddb6bd46-7xzpv\" (UID: \"7bade6d7-3a37-4a66-b777-acddd50efb79\") " pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.874915 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:07 crc kubenswrapper[4941]: I0307 07:08:07.963326 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556f1f3c-e67b-4c55-a7e4-46ee9afcebfa" path="/var/lib/kubelet/pods/556f1f3c-e67b-4c55-a7e4-46ee9afcebfa/volumes" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.089627 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-7xzpv"] Mar 07 07:08:08 crc kubenswrapper[4941]: W0307 07:08:08.095186 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bade6d7_3a37_4a66_b777_acddd50efb79.slice/crio-3c89ecbe22142261678f6b7d2db82109b6ee26fa636eecf46838511911cd9185 WatchSource:0}: Error finding container 3c89ecbe22142261678f6b7d2db82109b6ee26fa636eecf46838511911cd9185: Status 404 returned error can't find the container with id 3c89ecbe22142261678f6b7d2db82109b6ee26fa636eecf46838511911cd9185 Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.195446 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb"] Mar 07 07:08:08 crc kubenswrapper[4941]: W0307 07:08:08.200068 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod477add90_db8d_449b_bc6f_45618a7e89f6.slice/crio-c6367f0cb22493715173d5adfb88b805cac9ec76eafe04564b31c8aeda1826e0 WatchSource:0}: Error finding container c6367f0cb22493715173d5adfb88b805cac9ec76eafe04564b31c8aeda1826e0: Status 404 returned error can't find the container with id c6367f0cb22493715173d5adfb88b805cac9ec76eafe04564b31c8aeda1826e0 Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.206816 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d571499c-14eb-495f-930d-9dafb0a3a093-metrics-certs\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.211677 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d571499c-14eb-495f-930d-9dafb0a3a093-metrics-certs\") pod \"frr-k8s-gdcv9\" (UID: \"d571499c-14eb-495f-930d-9dafb0a3a093\") " pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.307875 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-memberlist\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.307989 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metrics-certs\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.312558 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-metrics-certs\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.312854 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49a7425c-52fd-48b6-a2de-53dc8ab8c531-memberlist\") pod \"speaker-22x9t\" (UID: \"49a7425c-52fd-48b6-a2de-53dc8ab8c531\") " pod="metallb-system/speaker-22x9t" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 
07:08:08.349205 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.446798 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" event={"ID":"477add90-db8d-449b-bc6f-45618a7e89f6","Type":"ContainerStarted","Data":"c6367f0cb22493715173d5adfb88b805cac9ec76eafe04564b31c8aeda1826e0"} Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.448462 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-7xzpv" event={"ID":"7bade6d7-3a37-4a66-b777-acddd50efb79","Type":"ContainerStarted","Data":"682c88de867e710636fed3475454030f00302f1fcb7bbd5065c478218aae0218"} Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.448484 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-7xzpv" event={"ID":"7bade6d7-3a37-4a66-b777-acddd50efb79","Type":"ContainerStarted","Data":"a2d8fd1c9e685b2686a151ecc84ee9b613e09fcd53e39693b9f1d40dbe58174d"} Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.448493 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-7xzpv" event={"ID":"7bade6d7-3a37-4a66-b777-acddd50efb79","Type":"ContainerStarted","Data":"3c89ecbe22142261678f6b7d2db82109b6ee26fa636eecf46838511911cd9185"} Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.448780 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.461260 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-22x9t" Mar 07 07:08:08 crc kubenswrapper[4941]: I0307 07:08:08.472360 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-7xzpv" podStartSLOduration=1.472335208 podStartE2EDuration="1.472335208s" podCreationTimestamp="2026-03-07 07:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:08:08.467069448 +0000 UTC m=+985.419434933" watchObservedRunningTime="2026-03-07 07:08:08.472335208 +0000 UTC m=+985.424700693" Mar 07 07:08:08 crc kubenswrapper[4941]: W0307 07:08:08.481554 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a7425c_52fd_48b6_a2de_53dc8ab8c531.slice/crio-4f7cfaac425c10796d74c67264dde43349a6c9ee7355cf1fdee7a34f3347fadb WatchSource:0}: Error finding container 4f7cfaac425c10796d74c67264dde43349a6c9ee7355cf1fdee7a34f3347fadb: Status 404 returned error can't find the container with id 4f7cfaac425c10796d74c67264dde43349a6c9ee7355cf1fdee7a34f3347fadb Mar 07 07:08:09 crc kubenswrapper[4941]: I0307 07:08:09.455102 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-22x9t" event={"ID":"49a7425c-52fd-48b6-a2de-53dc8ab8c531","Type":"ContainerStarted","Data":"91144f9e11c572c6dd7108e70321b5ca90e4272a413624e91f6d6843c562fce4"} Mar 07 07:08:09 crc kubenswrapper[4941]: I0307 07:08:09.455191 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-22x9t" event={"ID":"49a7425c-52fd-48b6-a2de-53dc8ab8c531","Type":"ContainerStarted","Data":"e937f8c909840d609e543715b6fe0ea0dc07b5a7db5ff60ed279b10f84332fb2"} Mar 07 07:08:09 crc kubenswrapper[4941]: I0307 07:08:09.455201 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-22x9t" 
event={"ID":"49a7425c-52fd-48b6-a2de-53dc8ab8c531","Type":"ContainerStarted","Data":"4f7cfaac425c10796d74c67264dde43349a6c9ee7355cf1fdee7a34f3347fadb"} Mar 07 07:08:09 crc kubenswrapper[4941]: I0307 07:08:09.455366 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-22x9t" Mar 07 07:08:09 crc kubenswrapper[4941]: I0307 07:08:09.455863 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerStarted","Data":"d31f9afcb013eb5eba433dbcaf3946d05679ddd7ff267c02d73febf269ce5a63"} Mar 07 07:08:09 crc kubenswrapper[4941]: I0307 07:08:09.470064 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-22x9t" podStartSLOduration=2.470045508 podStartE2EDuration="2.470045508s" podCreationTimestamp="2026-03-07 07:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:08:09.468302715 +0000 UTC m=+986.420668170" watchObservedRunningTime="2026-03-07 07:08:09.470045508 +0000 UTC m=+986.422410973" Mar 07 07:08:10 crc kubenswrapper[4941]: I0307 07:08:10.314742 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:08:10 crc kubenswrapper[4941]: I0307 07:08:10.314961 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:08:15 crc kubenswrapper[4941]: I0307 07:08:15.531633 4941 
generic.go:334] "Generic (PLEG): container finished" podID="d571499c-14eb-495f-930d-9dafb0a3a093" containerID="280986508a4f7d4a4e46cb8bb4730faf85f4293753d4fb345fd41b54c586c888" exitCode=0 Mar 07 07:08:15 crc kubenswrapper[4941]: I0307 07:08:15.531830 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerDied","Data":"280986508a4f7d4a4e46cb8bb4730faf85f4293753d4fb345fd41b54c586c888"} Mar 07 07:08:15 crc kubenswrapper[4941]: I0307 07:08:15.542772 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" event={"ID":"477add90-db8d-449b-bc6f-45618a7e89f6","Type":"ContainerStarted","Data":"adffb34e13c957ceb5ed3e7434fd24fba5ba851556f44f1cafbf8572c427bcd4"} Mar 07 07:08:15 crc kubenswrapper[4941]: I0307 07:08:15.542908 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:15 crc kubenswrapper[4941]: I0307 07:08:15.581323 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" podStartSLOduration=1.6321786299999999 podStartE2EDuration="8.581298104s" podCreationTimestamp="2026-03-07 07:08:07 +0000 UTC" firstStartedPulling="2026-03-07 07:08:08.202517038 +0000 UTC m=+985.154882493" lastFinishedPulling="2026-03-07 07:08:15.151636502 +0000 UTC m=+992.104001967" observedRunningTime="2026-03-07 07:08:15.576740702 +0000 UTC m=+992.529106177" watchObservedRunningTime="2026-03-07 07:08:15.581298104 +0000 UTC m=+992.533663569" Mar 07 07:08:16 crc kubenswrapper[4941]: I0307 07:08:16.555767 4941 generic.go:334] "Generic (PLEG): container finished" podID="d571499c-14eb-495f-930d-9dafb0a3a093" containerID="fc29cc68226deecb46131db1a0f2dc15d73c99787bf73fc675507692ec2ea371" exitCode=0 Mar 07 07:08:16 crc kubenswrapper[4941]: I0307 07:08:16.555879 4941 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerDied","Data":"fc29cc68226deecb46131db1a0f2dc15d73c99787bf73fc675507692ec2ea371"} Mar 07 07:08:17 crc kubenswrapper[4941]: I0307 07:08:17.569645 4941 generic.go:334] "Generic (PLEG): container finished" podID="d571499c-14eb-495f-930d-9dafb0a3a093" containerID="b44cb2fd066f0e355552756cd4f1fd92c6e22f9c9ef2b50fc6247a2e4fb83812" exitCode=0 Mar 07 07:08:17 crc kubenswrapper[4941]: I0307 07:08:17.569892 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerDied","Data":"b44cb2fd066f0e355552756cd4f1fd92c6e22f9c9ef2b50fc6247a2e4fb83812"} Mar 07 07:08:18 crc kubenswrapper[4941]: I0307 07:08:18.465900 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-22x9t" Mar 07 07:08:18 crc kubenswrapper[4941]: I0307 07:08:18.586140 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerStarted","Data":"d6fad0f0c4ec16f90ee7a16ecb6c894f2f7bafacce172cbabb77e44aa492f1f4"} Mar 07 07:08:18 crc kubenswrapper[4941]: I0307 07:08:18.586189 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerStarted","Data":"893bcfb8331a4421565d5f5578c875d8ba126f9df1e6d13f5b47e9c0ac7ee729"} Mar 07 07:08:18 crc kubenswrapper[4941]: I0307 07:08:18.586202 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerStarted","Data":"7cd51cb1482894fa2cf2e1cad9ca69737629d708e271368f8cccc6682410845d"} Mar 07 07:08:18 crc kubenswrapper[4941]: I0307 07:08:18.586213 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerStarted","Data":"22f6fa104f8c8bfc4f7e03ef7d9cdcee1c08a951a9542ad8d09cb21b4ae87edc"} Mar 07 07:08:18 crc kubenswrapper[4941]: I0307 07:08:18.586223 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerStarted","Data":"ab7bf511517b76e0419a8a82ac8d5d482484e3e2d121b740f4dc1786eff7d17b"} Mar 07 07:08:19 crc kubenswrapper[4941]: I0307 07:08:19.599302 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gdcv9" event={"ID":"d571499c-14eb-495f-930d-9dafb0a3a093","Type":"ContainerStarted","Data":"e79e7414aaf5902aa314d743b0c81fc2638dee8ead946fc9bd2290c09ebe4d58"} Mar 07 07:08:19 crc kubenswrapper[4941]: I0307 07:08:19.599583 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:19 crc kubenswrapper[4941]: I0307 07:08:19.940075 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-gdcv9" podStartSLOduration=6.292709549 podStartE2EDuration="12.940043557s" podCreationTimestamp="2026-03-07 07:08:07 +0000 UTC" firstStartedPulling="2026-03-07 07:08:08.488483845 +0000 UTC m=+985.440849310" lastFinishedPulling="2026-03-07 07:08:15.135817863 +0000 UTC m=+992.088183318" observedRunningTime="2026-03-07 07:08:19.639785099 +0000 UTC m=+996.592150564" watchObservedRunningTime="2026-03-07 07:08:19.940043557 +0000 UTC m=+996.892409042" Mar 07 07:08:19 crc kubenswrapper[4941]: I0307 07:08:19.943316 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r"] Mar 07 07:08:19 crc kubenswrapper[4941]: I0307 07:08:19.944790 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:19 crc kubenswrapper[4941]: I0307 07:08:19.948937 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 07 07:08:19 crc kubenswrapper[4941]: I0307 07:08:19.952318 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r"] Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.091113 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.091499 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.091685 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwhzj\" (UniqueName: \"kubernetes.io/projected/25b9610f-6fe6-40b8-868f-0a834314b6c8-kube-api-access-wwhzj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: 
I0307 07:08:20.193651 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.193743 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwhzj\" (UniqueName: \"kubernetes.io/projected/25b9610f-6fe6-40b8-868f-0a834314b6c8-kube-api-access-wwhzj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.193781 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.194600 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.194662 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.228052 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwhzj\" (UniqueName: \"kubernetes.io/projected/25b9610f-6fe6-40b8-868f-0a834314b6c8-kube-api-access-wwhzj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.266086 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.568621 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r"] Mar 07 07:08:20 crc kubenswrapper[4941]: I0307 07:08:20.606637 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" event={"ID":"25b9610f-6fe6-40b8-868f-0a834314b6c8","Type":"ContainerStarted","Data":"dd8729a58b725e9680022731c26785f06b486f465f0eb4b52dcca8aee1a8b2e0"} Mar 07 07:08:21 crc kubenswrapper[4941]: I0307 07:08:21.615092 4941 generic.go:334] "Generic (PLEG): container finished" podID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerID="58be27b8c9dc0eb7d5ca5aeb44b4830d3255daa9e854fbcb1a5728fe601d8105" exitCode=0 Mar 07 07:08:21 crc kubenswrapper[4941]: I0307 07:08:21.615346 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" event={"ID":"25b9610f-6fe6-40b8-868f-0a834314b6c8","Type":"ContainerDied","Data":"58be27b8c9dc0eb7d5ca5aeb44b4830d3255daa9e854fbcb1a5728fe601d8105"} Mar 07 07:08:23 crc kubenswrapper[4941]: I0307 07:08:23.350177 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:23 crc kubenswrapper[4941]: I0307 07:08:23.397354 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:25 crc kubenswrapper[4941]: I0307 07:08:25.644184 4941 generic.go:334] "Generic (PLEG): container finished" podID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerID="d09b762c197e6493524ca43c8a608b57ce5d6341e8b303db57a747631b4ea890" exitCode=0 Mar 07 07:08:25 crc kubenswrapper[4941]: I0307 07:08:25.644795 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" event={"ID":"25b9610f-6fe6-40b8-868f-0a834314b6c8","Type":"ContainerDied","Data":"d09b762c197e6493524ca43c8a608b57ce5d6341e8b303db57a747631b4ea890"} Mar 07 07:08:26 crc kubenswrapper[4941]: I0307 07:08:26.659696 4941 generic.go:334] "Generic (PLEG): container finished" podID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerID="ff0d1c813b89d94d63d18f45c721d3840f4b3549ce6aa81b4ec452b29bbd30a2" exitCode=0 Mar 07 07:08:26 crc kubenswrapper[4941]: I0307 07:08:26.659783 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" event={"ID":"25b9610f-6fe6-40b8-868f-0a834314b6c8","Type":"ContainerDied","Data":"ff0d1c813b89d94d63d18f45c721d3840f4b3549ce6aa81b4ec452b29bbd30a2"} Mar 07 07:08:27 crc kubenswrapper[4941]: I0307 07:08:27.780156 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-4f2fb" Mar 07 07:08:27 crc kubenswrapper[4941]: I0307 07:08:27.886835 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-7xzpv" Mar 07 07:08:27 crc kubenswrapper[4941]: I0307 07:08:27.930375 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.006318 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwhzj\" (UniqueName: \"kubernetes.io/projected/25b9610f-6fe6-40b8-868f-0a834314b6c8-kube-api-access-wwhzj\") pod \"25b9610f-6fe6-40b8-868f-0a834314b6c8\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.007851 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-bundle\") pod \"25b9610f-6fe6-40b8-868f-0a834314b6c8\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.007945 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-util\") pod \"25b9610f-6fe6-40b8-868f-0a834314b6c8\" (UID: \"25b9610f-6fe6-40b8-868f-0a834314b6c8\") " Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.009432 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-bundle" (OuterVolumeSpecName: "bundle") pod "25b9610f-6fe6-40b8-868f-0a834314b6c8" (UID: "25b9610f-6fe6-40b8-868f-0a834314b6c8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.013504 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b9610f-6fe6-40b8-868f-0a834314b6c8-kube-api-access-wwhzj" (OuterVolumeSpecName: "kube-api-access-wwhzj") pod "25b9610f-6fe6-40b8-868f-0a834314b6c8" (UID: "25b9610f-6fe6-40b8-868f-0a834314b6c8"). InnerVolumeSpecName "kube-api-access-wwhzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.018346 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-util" (OuterVolumeSpecName: "util") pod "25b9610f-6fe6-40b8-868f-0a834314b6c8" (UID: "25b9610f-6fe6-40b8-868f-0a834314b6c8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.110313 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwhzj\" (UniqueName: \"kubernetes.io/projected/25b9610f-6fe6-40b8-868f-0a834314b6c8-kube-api-access-wwhzj\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.110358 4941 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.110366 4941 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b9610f-6fe6-40b8-868f-0a834314b6c8-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.352771 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-gdcv9" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.674954 4941 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" event={"ID":"25b9610f-6fe6-40b8-868f-0a834314b6c8","Type":"ContainerDied","Data":"dd8729a58b725e9680022731c26785f06b486f465f0eb4b52dcca8aee1a8b2e0"} Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.675415 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd8729a58b725e9680022731c26785f06b486f465f0eb4b52dcca8aee1a8b2e0" Mar 07 07:08:28 crc kubenswrapper[4941]: I0307 07:08:28.675040 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.374118 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb"] Mar 07 07:08:33 crc kubenswrapper[4941]: E0307 07:08:33.374864 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerName="util" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.374878 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerName="util" Mar 07 07:08:33 crc kubenswrapper[4941]: E0307 07:08:33.374904 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerName="extract" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.374909 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerName="extract" Mar 07 07:08:33 crc kubenswrapper[4941]: E0307 07:08:33.374920 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerName="pull" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.374927 4941 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerName="pull" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.375026 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b9610f-6fe6-40b8-868f-0a834314b6c8" containerName="extract" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.375474 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.382963 4941 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-phml6" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.383213 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.385068 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.406985 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb"] Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.481823 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq699\" (UniqueName: \"kubernetes.io/projected/ae13b656-bd75-4ccb-a6e9-11a7d48737b3-kube-api-access-dq699\") pod \"cert-manager-operator-controller-manager-66c8bdd694-wb7vb\" (UID: \"ae13b656-bd75-4ccb-a6e9-11a7d48737b3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.481980 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ae13b656-bd75-4ccb-a6e9-11a7d48737b3-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-wb7vb\" (UID: \"ae13b656-bd75-4ccb-a6e9-11a7d48737b3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.583606 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq699\" (UniqueName: \"kubernetes.io/projected/ae13b656-bd75-4ccb-a6e9-11a7d48737b3-kube-api-access-dq699\") pod \"cert-manager-operator-controller-manager-66c8bdd694-wb7vb\" (UID: \"ae13b656-bd75-4ccb-a6e9-11a7d48737b3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.583703 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae13b656-bd75-4ccb-a6e9-11a7d48737b3-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-wb7vb\" (UID: \"ae13b656-bd75-4ccb-a6e9-11a7d48737b3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.584455 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae13b656-bd75-4ccb-a6e9-11a7d48737b3-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-wb7vb\" (UID: \"ae13b656-bd75-4ccb-a6e9-11a7d48737b3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.619514 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq699\" (UniqueName: \"kubernetes.io/projected/ae13b656-bd75-4ccb-a6e9-11a7d48737b3-kube-api-access-dq699\") pod \"cert-manager-operator-controller-manager-66c8bdd694-wb7vb\" (UID: \"ae13b656-bd75-4ccb-a6e9-11a7d48737b3\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" Mar 07 07:08:33 crc kubenswrapper[4941]: I0307 07:08:33.708721 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" Mar 07 07:08:34 crc kubenswrapper[4941]: I0307 07:08:34.086489 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb"] Mar 07 07:08:34 crc kubenswrapper[4941]: W0307 07:08:34.099830 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae13b656_bd75_4ccb_a6e9_11a7d48737b3.slice/crio-da81f34507fa50644cc95d6e489dd7648fcbc302008a955c25098d39059bc858 WatchSource:0}: Error finding container da81f34507fa50644cc95d6e489dd7648fcbc302008a955c25098d39059bc858: Status 404 returned error can't find the container with id da81f34507fa50644cc95d6e489dd7648fcbc302008a955c25098d39059bc858 Mar 07 07:08:34 crc kubenswrapper[4941]: I0307 07:08:34.714255 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" event={"ID":"ae13b656-bd75-4ccb-a6e9-11a7d48737b3","Type":"ContainerStarted","Data":"da81f34507fa50644cc95d6e489dd7648fcbc302008a955c25098d39059bc858"} Mar 07 07:08:37 crc kubenswrapper[4941]: I0307 07:08:37.753369 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" event={"ID":"ae13b656-bd75-4ccb-a6e9-11a7d48737b3","Type":"ContainerStarted","Data":"b6cf30a65c73236f9c1479084429cad06ccc19bdf0156137c936d2eaf190ad4d"} Mar 07 07:08:37 crc kubenswrapper[4941]: I0307 07:08:37.783359 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-wb7vb" 
podStartSLOduration=1.861360658 podStartE2EDuration="4.783335307s" podCreationTimestamp="2026-03-07 07:08:33 +0000 UTC" firstStartedPulling="2026-03-07 07:08:34.102515915 +0000 UTC m=+1011.054881380" lastFinishedPulling="2026-03-07 07:08:37.024490564 +0000 UTC m=+1013.976856029" observedRunningTime="2026-03-07 07:08:37.776452147 +0000 UTC m=+1014.728817612" watchObservedRunningTime="2026-03-07 07:08:37.783335307 +0000 UTC m=+1014.735700782" Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.314875 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.315486 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.315599 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.316434 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81c89fa64b6b91f6338e8315cd83a021b0214053cc3ad130bb16369071ad3bcf"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.316528 4941 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://81c89fa64b6b91f6338e8315cd83a021b0214053cc3ad130bb16369071ad3bcf" gracePeriod=600 Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.781572 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="81c89fa64b6b91f6338e8315cd83a021b0214053cc3ad130bb16369071ad3bcf" exitCode=0 Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.781627 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"81c89fa64b6b91f6338e8315cd83a021b0214053cc3ad130bb16369071ad3bcf"} Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.782031 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"8b286093b0ba04c8db409f2f8003244d432459a7a31a64eb7ee6e534880ca523"} Mar 07 07:08:40 crc kubenswrapper[4941]: I0307 07:08:40.782054 4941 scope.go:117] "RemoveContainer" containerID="b7fe83ca68d83b7b6bf2fa18e88a311d0a293429f704eef511a481cd353a9e5a" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.632766 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pkt46"] Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.634115 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.636268 4941 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-x7bj6" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.636291 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.636732 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.649445 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pkt46"] Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.700753 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lwxdj"] Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.701726 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.705677 4941 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-n5lf8" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.710229 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lwxdj"] Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.813793 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfxm\" (UniqueName: \"kubernetes.io/projected/ffa02105-9624-412f-927b-52fce95120aa-kube-api-access-wlfxm\") pod \"cert-manager-cainjector-5545bd876-lwxdj\" (UID: \"ffa02105-9624-412f-927b-52fce95120aa\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.814688 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72whk\" (UniqueName: \"kubernetes.io/projected/7b5f2700-be42-4711-b35c-5b90686ebe9a-kube-api-access-72whk\") pod \"cert-manager-webhook-6888856db4-pkt46\" (UID: \"7b5f2700-be42-4711-b35c-5b90686ebe9a\") " pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.814744 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ffa02105-9624-412f-927b-52fce95120aa-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lwxdj\" (UID: \"ffa02105-9624-412f-927b-52fce95120aa\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.814878 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7b5f2700-be42-4711-b35c-5b90686ebe9a-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pkt46\" (UID: \"7b5f2700-be42-4711-b35c-5b90686ebe9a\") " pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.916044 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72whk\" (UniqueName: \"kubernetes.io/projected/7b5f2700-be42-4711-b35c-5b90686ebe9a-kube-api-access-72whk\") pod \"cert-manager-webhook-6888856db4-pkt46\" (UID: \"7b5f2700-be42-4711-b35c-5b90686ebe9a\") " pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.916101 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ffa02105-9624-412f-927b-52fce95120aa-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lwxdj\" (UID: \"ffa02105-9624-412f-927b-52fce95120aa\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.916155 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b5f2700-be42-4711-b35c-5b90686ebe9a-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pkt46\" (UID: \"7b5f2700-be42-4711-b35c-5b90686ebe9a\") " pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.916236 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfxm\" (UniqueName: \"kubernetes.io/projected/ffa02105-9624-412f-927b-52fce95120aa-kube-api-access-wlfxm\") pod \"cert-manager-cainjector-5545bd876-lwxdj\" (UID: \"ffa02105-9624-412f-927b-52fce95120aa\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.943534 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72whk\" (UniqueName: \"kubernetes.io/projected/7b5f2700-be42-4711-b35c-5b90686ebe9a-kube-api-access-72whk\") pod \"cert-manager-webhook-6888856db4-pkt46\" (UID: \"7b5f2700-be42-4711-b35c-5b90686ebe9a\") " pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.943564 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b5f2700-be42-4711-b35c-5b90686ebe9a-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-pkt46\" (UID: \"7b5f2700-be42-4711-b35c-5b90686ebe9a\") " pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.943617 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ffa02105-9624-412f-927b-52fce95120aa-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lwxdj\" (UID: \"ffa02105-9624-412f-927b-52fce95120aa\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.945525 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfxm\" (UniqueName: \"kubernetes.io/projected/ffa02105-9624-412f-927b-52fce95120aa-kube-api-access-wlfxm\") pod \"cert-manager-cainjector-5545bd876-lwxdj\" (UID: \"ffa02105-9624-412f-927b-52fce95120aa\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" Mar 07 07:08:41 crc kubenswrapper[4941]: I0307 07:08:41.956935 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:42 crc kubenswrapper[4941]: I0307 07:08:42.022693 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" Mar 07 07:08:42 crc kubenswrapper[4941]: I0307 07:08:42.285300 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lwxdj"] Mar 07 07:08:42 crc kubenswrapper[4941]: I0307 07:08:42.391533 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-pkt46"] Mar 07 07:08:42 crc kubenswrapper[4941]: W0307 07:08:42.392830 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b5f2700_be42_4711_b35c_5b90686ebe9a.slice/crio-409c1dfc10c41e8ecf26236b328df2190db2f9678f3b4fc9bbb49ef151f4f313 WatchSource:0}: Error finding container 409c1dfc10c41e8ecf26236b328df2190db2f9678f3b4fc9bbb49ef151f4f313: Status 404 returned error can't find the container with id 409c1dfc10c41e8ecf26236b328df2190db2f9678f3b4fc9bbb49ef151f4f313 Mar 07 07:08:42 crc kubenswrapper[4941]: I0307 07:08:42.799644 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" event={"ID":"7b5f2700-be42-4711-b35c-5b90686ebe9a","Type":"ContainerStarted","Data":"409c1dfc10c41e8ecf26236b328df2190db2f9678f3b4fc9bbb49ef151f4f313"} Mar 07 07:08:42 crc kubenswrapper[4941]: I0307 07:08:42.800745 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" event={"ID":"ffa02105-9624-412f-927b-52fce95120aa","Type":"ContainerStarted","Data":"e4bb9428fef40c670fe866ec7a3a615b5044bae2950e574f92750fb0c716f56d"} Mar 07 07:08:47 crc kubenswrapper[4941]: I0307 07:08:47.851035 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" event={"ID":"7b5f2700-be42-4711-b35c-5b90686ebe9a","Type":"ContainerStarted","Data":"28dfceba8286d76425b747328649cf97b3f189d3b5e881f51ccc46029364e087"} Mar 07 07:08:47 crc 
kubenswrapper[4941]: I0307 07:08:47.851714 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:08:47 crc kubenswrapper[4941]: I0307 07:08:47.852726 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" event={"ID":"ffa02105-9624-412f-927b-52fce95120aa","Type":"ContainerStarted","Data":"c930978e47e0706818ff481dc10c7633af9a971a9fedb45ffa99b5a60e182687"} Mar 07 07:08:47 crc kubenswrapper[4941]: I0307 07:08:47.874768 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" podStartSLOduration=2.210927731 podStartE2EDuration="6.874742841s" podCreationTimestamp="2026-03-07 07:08:41 +0000 UTC" firstStartedPulling="2026-03-07 07:08:42.396232265 +0000 UTC m=+1019.348597730" lastFinishedPulling="2026-03-07 07:08:47.060047355 +0000 UTC m=+1024.012412840" observedRunningTime="2026-03-07 07:08:47.869955723 +0000 UTC m=+1024.822321228" watchObservedRunningTime="2026-03-07 07:08:47.874742841 +0000 UTC m=+1024.827108316" Mar 07 07:08:47 crc kubenswrapper[4941]: I0307 07:08:47.887637 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-lwxdj" podStartSLOduration=2.056422849 podStartE2EDuration="6.887616378s" podCreationTimestamp="2026-03-07 07:08:41 +0000 UTC" firstStartedPulling="2026-03-07 07:08:42.297365972 +0000 UTC m=+1019.249731437" lastFinishedPulling="2026-03-07 07:08:47.128559501 +0000 UTC m=+1024.080924966" observedRunningTime="2026-03-07 07:08:47.88606232 +0000 UTC m=+1024.838427795" watchObservedRunningTime="2026-03-07 07:08:47.887616378 +0000 UTC m=+1024.839981843" Mar 07 07:08:56 crc kubenswrapper[4941]: I0307 07:08:56.960035 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-pkt46" Mar 07 07:09:00 crc 
kubenswrapper[4941]: I0307 07:09:00.390889 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-v96l6"] Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.393776 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-v96l6" Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.397925 4941 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-rzpph" Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.411596 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-v96l6"] Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.496376 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72309cf8-8de4-4863-be70-0d23cc50d0dc-bound-sa-token\") pod \"cert-manager-545d4d4674-v96l6\" (UID: \"72309cf8-8de4-4863-be70-0d23cc50d0dc\") " pod="cert-manager/cert-manager-545d4d4674-v96l6" Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.496481 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2q64\" (UniqueName: \"kubernetes.io/projected/72309cf8-8de4-4863-be70-0d23cc50d0dc-kube-api-access-b2q64\") pod \"cert-manager-545d4d4674-v96l6\" (UID: \"72309cf8-8de4-4863-be70-0d23cc50d0dc\") " pod="cert-manager/cert-manager-545d4d4674-v96l6" Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.598125 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2q64\" (UniqueName: \"kubernetes.io/projected/72309cf8-8de4-4863-be70-0d23cc50d0dc-kube-api-access-b2q64\") pod \"cert-manager-545d4d4674-v96l6\" (UID: \"72309cf8-8de4-4863-be70-0d23cc50d0dc\") " pod="cert-manager/cert-manager-545d4d4674-v96l6" Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.598225 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72309cf8-8de4-4863-be70-0d23cc50d0dc-bound-sa-token\") pod \"cert-manager-545d4d4674-v96l6\" (UID: \"72309cf8-8de4-4863-be70-0d23cc50d0dc\") " pod="cert-manager/cert-manager-545d4d4674-v96l6" Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.633301 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72309cf8-8de4-4863-be70-0d23cc50d0dc-bound-sa-token\") pod \"cert-manager-545d4d4674-v96l6\" (UID: \"72309cf8-8de4-4863-be70-0d23cc50d0dc\") " pod="cert-manager/cert-manager-545d4d4674-v96l6" Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.633534 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2q64\" (UniqueName: \"kubernetes.io/projected/72309cf8-8de4-4863-be70-0d23cc50d0dc-kube-api-access-b2q64\") pod \"cert-manager-545d4d4674-v96l6\" (UID: \"72309cf8-8de4-4863-be70-0d23cc50d0dc\") " pod="cert-manager/cert-manager-545d4d4674-v96l6" Mar 07 07:09:00 crc kubenswrapper[4941]: I0307 07:09:00.720076 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-v96l6" Mar 07 07:09:01 crc kubenswrapper[4941]: I0307 07:09:01.128323 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-v96l6"] Mar 07 07:09:01 crc kubenswrapper[4941]: I0307 07:09:01.968516 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-v96l6" event={"ID":"72309cf8-8de4-4863-be70-0d23cc50d0dc","Type":"ContainerStarted","Data":"375ffc4ce14a070755fcd79cbe8f391903c6bdfb856c5b40bc7f33b7cdd00c98"} Mar 07 07:09:01 crc kubenswrapper[4941]: I0307 07:09:01.968613 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-v96l6" event={"ID":"72309cf8-8de4-4863-be70-0d23cc50d0dc","Type":"ContainerStarted","Data":"983149b4b536f731fdc1c6a7646b25d11c20de1ec256af258754ea30c09ed38e"} Mar 07 07:09:01 crc kubenswrapper[4941]: I0307 07:09:01.983875 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-v96l6" podStartSLOduration=1.983849296 podStartE2EDuration="1.983849296s" podCreationTimestamp="2026-03-07 07:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:09:01.979325885 +0000 UTC m=+1038.931691370" watchObservedRunningTime="2026-03-07 07:09:01.983849296 +0000 UTC m=+1038.936214781" Mar 07 07:09:02 crc kubenswrapper[4941]: I0307 07:09:02.687868 4941 scope.go:117] "RemoveContainer" containerID="fec4a1c5879c5bc7d04268a2f315e08c10492170ef27522b5f50cb4ab4b9a1dd" Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.613642 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ttsn5"] Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.615482 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ttsn5" Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.620849 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.620851 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.620932 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zwpnw" Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.652232 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ttsn5"] Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.753388 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt65w\" (UniqueName: \"kubernetes.io/projected/f4e36f23-e2b5-4f38-9c14-74559d31eaf0-kube-api-access-pt65w\") pod \"openstack-operator-index-ttsn5\" (UID: \"f4e36f23-e2b5-4f38-9c14-74559d31eaf0\") " pod="openstack-operators/openstack-operator-index-ttsn5" Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.854937 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt65w\" (UniqueName: \"kubernetes.io/projected/f4e36f23-e2b5-4f38-9c14-74559d31eaf0-kube-api-access-pt65w\") pod \"openstack-operator-index-ttsn5\" (UID: \"f4e36f23-e2b5-4f38-9c14-74559d31eaf0\") " pod="openstack-operators/openstack-operator-index-ttsn5" Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.875091 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt65w\" (UniqueName: \"kubernetes.io/projected/f4e36f23-e2b5-4f38-9c14-74559d31eaf0-kube-api-access-pt65w\") pod \"openstack-operator-index-ttsn5\" (UID: 
\"f4e36f23-e2b5-4f38-9c14-74559d31eaf0\") " pod="openstack-operators/openstack-operator-index-ttsn5" Mar 07 07:09:10 crc kubenswrapper[4941]: I0307 07:09:10.933318 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ttsn5" Mar 07 07:09:11 crc kubenswrapper[4941]: I0307 07:09:11.371106 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ttsn5"] Mar 07 07:09:11 crc kubenswrapper[4941]: W0307 07:09:11.389678 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e36f23_e2b5_4f38_9c14_74559d31eaf0.slice/crio-43e548167e200083e62eeee60a5fdeb1b391704ba2be1d7c61f8c6c6409b38fb WatchSource:0}: Error finding container 43e548167e200083e62eeee60a5fdeb1b391704ba2be1d7c61f8c6c6409b38fb: Status 404 returned error can't find the container with id 43e548167e200083e62eeee60a5fdeb1b391704ba2be1d7c61f8c6c6409b38fb Mar 07 07:09:12 crc kubenswrapper[4941]: I0307 07:09:12.033136 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ttsn5" event={"ID":"f4e36f23-e2b5-4f38-9c14-74559d31eaf0","Type":"ContainerStarted","Data":"43e548167e200083e62eeee60a5fdeb1b391704ba2be1d7c61f8c6c6409b38fb"} Mar 07 07:09:13 crc kubenswrapper[4941]: I0307 07:09:13.045030 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ttsn5" event={"ID":"f4e36f23-e2b5-4f38-9c14-74559d31eaf0","Type":"ContainerStarted","Data":"e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b"} Mar 07 07:09:13 crc kubenswrapper[4941]: I0307 07:09:13.069483 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ttsn5" podStartSLOduration=2.2678017280000002 podStartE2EDuration="3.069451114s" podCreationTimestamp="2026-03-07 07:09:10 +0000 UTC" 
firstStartedPulling="2026-03-07 07:09:11.39573267 +0000 UTC m=+1048.348098135" lastFinishedPulling="2026-03-07 07:09:12.197382046 +0000 UTC m=+1049.149747521" observedRunningTime="2026-03-07 07:09:13.067884536 +0000 UTC m=+1050.020250021" watchObservedRunningTime="2026-03-07 07:09:13.069451114 +0000 UTC m=+1050.021816579" Mar 07 07:09:13 crc kubenswrapper[4941]: I0307 07:09:13.978053 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ttsn5"] Mar 07 07:09:14 crc kubenswrapper[4941]: I0307 07:09:14.578804 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-t85z5"] Mar 07 07:09:14 crc kubenswrapper[4941]: I0307 07:09:14.580610 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:14 crc kubenswrapper[4941]: I0307 07:09:14.585863 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t85z5"] Mar 07 07:09:14 crc kubenswrapper[4941]: I0307 07:09:14.611962 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skflc\" (UniqueName: \"kubernetes.io/projected/37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3-kube-api-access-skflc\") pod \"openstack-operator-index-t85z5\" (UID: \"37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3\") " pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:14 crc kubenswrapper[4941]: I0307 07:09:14.713170 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skflc\" (UniqueName: \"kubernetes.io/projected/37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3-kube-api-access-skflc\") pod \"openstack-operator-index-t85z5\" (UID: \"37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3\") " pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:14 crc kubenswrapper[4941]: I0307 07:09:14.735518 4941 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-skflc\" (UniqueName: \"kubernetes.io/projected/37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3-kube-api-access-skflc\") pod \"openstack-operator-index-t85z5\" (UID: \"37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3\") " pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:14 crc kubenswrapper[4941]: I0307 07:09:14.906650 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:15 crc kubenswrapper[4941]: I0307 07:09:15.063726 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ttsn5" podUID="f4e36f23-e2b5-4f38-9c14-74559d31eaf0" containerName="registry-server" containerID="cri-o://e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b" gracePeriod=2 Mar 07 07:09:15 crc kubenswrapper[4941]: I0307 07:09:15.343318 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t85z5"] Mar 07 07:09:15 crc kubenswrapper[4941]: I0307 07:09:15.395956 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ttsn5" Mar 07 07:09:15 crc kubenswrapper[4941]: I0307 07:09:15.524150 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt65w\" (UniqueName: \"kubernetes.io/projected/f4e36f23-e2b5-4f38-9c14-74559d31eaf0-kube-api-access-pt65w\") pod \"f4e36f23-e2b5-4f38-9c14-74559d31eaf0\" (UID: \"f4e36f23-e2b5-4f38-9c14-74559d31eaf0\") " Mar 07 07:09:15 crc kubenswrapper[4941]: I0307 07:09:15.533459 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e36f23-e2b5-4f38-9c14-74559d31eaf0-kube-api-access-pt65w" (OuterVolumeSpecName: "kube-api-access-pt65w") pod "f4e36f23-e2b5-4f38-9c14-74559d31eaf0" (UID: "f4e36f23-e2b5-4f38-9c14-74559d31eaf0"). 
InnerVolumeSpecName "kube-api-access-pt65w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:09:15 crc kubenswrapper[4941]: I0307 07:09:15.626308 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt65w\" (UniqueName: \"kubernetes.io/projected/f4e36f23-e2b5-4f38-9c14-74559d31eaf0-kube-api-access-pt65w\") on node \"crc\" DevicePath \"\"" Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.071713 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t85z5" event={"ID":"37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3","Type":"ContainerStarted","Data":"550a0148e022319fd0e1f79df1203c0d6148fe8ca7a6341ef1412ebe67d4026c"} Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.071785 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t85z5" event={"ID":"37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3","Type":"ContainerStarted","Data":"f1c8af9a3df264b03f7fd3e70bac61ecaebcc2007e2a9fa3fb092a7dd4bf2ba7"} Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.074803 4941 generic.go:334] "Generic (PLEG): container finished" podID="f4e36f23-e2b5-4f38-9c14-74559d31eaf0" containerID="e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b" exitCode=0 Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.074880 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ttsn5" Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.074971 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ttsn5" event={"ID":"f4e36f23-e2b5-4f38-9c14-74559d31eaf0","Type":"ContainerDied","Data":"e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b"} Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.075071 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ttsn5" event={"ID":"f4e36f23-e2b5-4f38-9c14-74559d31eaf0","Type":"ContainerDied","Data":"43e548167e200083e62eeee60a5fdeb1b391704ba2be1d7c61f8c6c6409b38fb"} Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.075136 4941 scope.go:117] "RemoveContainer" containerID="e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b" Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.095937 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-t85z5" podStartSLOduration=1.6643757460000002 podStartE2EDuration="2.095861503s" podCreationTimestamp="2026-03-07 07:09:14 +0000 UTC" firstStartedPulling="2026-03-07 07:09:15.355127057 +0000 UTC m=+1052.307492522" lastFinishedPulling="2026-03-07 07:09:15.786612804 +0000 UTC m=+1052.738978279" observedRunningTime="2026-03-07 07:09:16.091823804 +0000 UTC m=+1053.044189299" watchObservedRunningTime="2026-03-07 07:09:16.095861503 +0000 UTC m=+1053.048227008" Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.107905 4941 scope.go:117] "RemoveContainer" containerID="e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b" Mar 07 07:09:16 crc kubenswrapper[4941]: E0307 07:09:16.108834 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b\": 
container with ID starting with e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b not found: ID does not exist" containerID="e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b" Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.108924 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b"} err="failed to get container status \"e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b\": rpc error: code = NotFound desc = could not find container \"e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b\": container with ID starting with e0887bb1e72e972807dc83e1a77cde91d72b37bf6e36370e30aae18793c78c8b not found: ID does not exist" Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.121196 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ttsn5"] Mar 07 07:09:16 crc kubenswrapper[4941]: I0307 07:09:16.129206 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-ttsn5"] Mar 07 07:09:17 crc kubenswrapper[4941]: I0307 07:09:17.966785 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e36f23-e2b5-4f38-9c14-74559d31eaf0" path="/var/lib/kubelet/pods/f4e36f23-e2b5-4f38-9c14-74559d31eaf0/volumes" Mar 07 07:09:24 crc kubenswrapper[4941]: I0307 07:09:24.907657 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:24 crc kubenswrapper[4941]: I0307 07:09:24.909239 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:24 crc kubenswrapper[4941]: I0307 07:09:24.936737 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:25 crc 
kubenswrapper[4941]: I0307 07:09:25.197240 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-t85z5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.439503 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5"] Mar 07 07:09:27 crc kubenswrapper[4941]: E0307 07:09:27.441205 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e36f23-e2b5-4f38-9c14-74559d31eaf0" containerName="registry-server" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.441289 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e36f23-e2b5-4f38-9c14-74559d31eaf0" containerName="registry-server" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.441500 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e36f23-e2b5-4f38-9c14-74559d31eaf0" containerName="registry-server" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.442453 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.445629 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bqrvj" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.455493 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5"] Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.609257 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r8ms\" (UniqueName: \"kubernetes.io/projected/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-kube-api-access-9r8ms\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.609316 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.609357 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 
07:09:27.710661 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r8ms\" (UniqueName: \"kubernetes.io/projected/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-kube-api-access-9r8ms\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.710726 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.710765 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.711295 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.711366 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.731201 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r8ms\" (UniqueName: \"kubernetes.io/projected/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-kube-api-access-9r8ms\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:27 crc kubenswrapper[4941]: I0307 07:09:27.763423 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:28 crc kubenswrapper[4941]: I0307 07:09:28.228057 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5"] Mar 07 07:09:29 crc kubenswrapper[4941]: I0307 07:09:29.193940 4941 generic.go:334] "Generic (PLEG): container finished" podID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerID="e1d685274b60161b8319ebd0db737c95a205a8305a2cbf80139ec6b32ed1c7b4" exitCode=0 Mar 07 07:09:29 crc kubenswrapper[4941]: I0307 07:09:29.194038 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" event={"ID":"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10","Type":"ContainerDied","Data":"e1d685274b60161b8319ebd0db737c95a205a8305a2cbf80139ec6b32ed1c7b4"} Mar 07 07:09:29 crc kubenswrapper[4941]: I0307 07:09:29.194417 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" event={"ID":"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10","Type":"ContainerStarted","Data":"c591f5548cc01a338c59b416c59482c0cdcfbd2bb4c4530121a374b9757dc1ff"} Mar 07 07:09:30 crc kubenswrapper[4941]: I0307 07:09:30.201255 4941 generic.go:334] "Generic (PLEG): container finished" podID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerID="0deff773d47a6ee55d6270d81db4d3b112206c5dca826f387e47d68906f6e531" exitCode=0 Mar 07 07:09:30 crc kubenswrapper[4941]: I0307 07:09:30.201305 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" event={"ID":"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10","Type":"ContainerDied","Data":"0deff773d47a6ee55d6270d81db4d3b112206c5dca826f387e47d68906f6e531"} Mar 07 07:09:31 crc kubenswrapper[4941]: I0307 07:09:31.215662 4941 generic.go:334] "Generic (PLEG): container finished" podID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerID="02cf3d6a2f3a24d8361071f755944b0fc51e4f55ca4aebcf40c5970417a53029" exitCode=0 Mar 07 07:09:31 crc kubenswrapper[4941]: I0307 07:09:31.215797 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" event={"ID":"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10","Type":"ContainerDied","Data":"02cf3d6a2f3a24d8361071f755944b0fc51e4f55ca4aebcf40c5970417a53029"} Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.555899 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.686397 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-bundle\") pod \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.686624 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-util\") pod \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.686663 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r8ms\" (UniqueName: \"kubernetes.io/projected/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-kube-api-access-9r8ms\") pod \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\" (UID: \"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10\") " Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.687807 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-bundle" (OuterVolumeSpecName: "bundle") pod "8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" (UID: "8a078fed-092e-4d8e-8f31-2c1d6fb0ea10"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.693898 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-kube-api-access-9r8ms" (OuterVolumeSpecName: "kube-api-access-9r8ms") pod "8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" (UID: "8a078fed-092e-4d8e-8f31-2c1d6fb0ea10"). InnerVolumeSpecName "kube-api-access-9r8ms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.705686 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-util" (OuterVolumeSpecName: "util") pod "8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" (UID: "8a078fed-092e-4d8e-8f31-2c1d6fb0ea10"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.788436 4941 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.788475 4941 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:09:32 crc kubenswrapper[4941]: I0307 07:09:32.788485 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r8ms\" (UniqueName: \"kubernetes.io/projected/8a078fed-092e-4d8e-8f31-2c1d6fb0ea10-kube-api-access-9r8ms\") on node \"crc\" DevicePath \"\"" Mar 07 07:09:33 crc kubenswrapper[4941]: I0307 07:09:33.237903 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" event={"ID":"8a078fed-092e-4d8e-8f31-2c1d6fb0ea10","Type":"ContainerDied","Data":"c591f5548cc01a338c59b416c59482c0cdcfbd2bb4c4530121a374b9757dc1ff"} Mar 07 07:09:33 crc kubenswrapper[4941]: I0307 07:09:33.237956 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c591f5548cc01a338c59b416c59482c0cdcfbd2bb4c4530121a374b9757dc1ff" Mar 07 07:09:33 crc kubenswrapper[4941]: I0307 07:09:33.238017 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5" Mar 07 07:09:39 crc kubenswrapper[4941]: I0307 07:09:39.975753 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf"] Mar 07 07:09:39 crc kubenswrapper[4941]: E0307 07:09:39.976965 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerName="extract" Mar 07 07:09:39 crc kubenswrapper[4941]: I0307 07:09:39.976985 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerName="extract" Mar 07 07:09:39 crc kubenswrapper[4941]: E0307 07:09:39.977002 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerName="pull" Mar 07 07:09:39 crc kubenswrapper[4941]: I0307 07:09:39.977010 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerName="pull" Mar 07 07:09:39 crc kubenswrapper[4941]: E0307 07:09:39.977021 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerName="util" Mar 07 07:09:39 crc kubenswrapper[4941]: I0307 07:09:39.977029 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerName="util" Mar 07 07:09:39 crc kubenswrapper[4941]: I0307 07:09:39.977172 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a078fed-092e-4d8e-8f31-2c1d6fb0ea10" containerName="extract" Mar 07 07:09:39 crc kubenswrapper[4941]: I0307 07:09:39.977726 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" Mar 07 07:09:39 crc kubenswrapper[4941]: I0307 07:09:39.982650 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-h4s92" Mar 07 07:09:40 crc kubenswrapper[4941]: I0307 07:09:40.066321 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf"] Mar 07 07:09:40 crc kubenswrapper[4941]: I0307 07:09:40.101341 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkd7\" (UniqueName: \"kubernetes.io/projected/86d787b0-daa0-45e6-8c5f-a540f61ec19a-kube-api-access-2dkd7\") pod \"openstack-operator-controller-init-6f44f7b99f-4bphf\" (UID: \"86d787b0-daa0-45e6-8c5f-a540f61ec19a\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" Mar 07 07:09:40 crc kubenswrapper[4941]: I0307 07:09:40.202722 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dkd7\" (UniqueName: \"kubernetes.io/projected/86d787b0-daa0-45e6-8c5f-a540f61ec19a-kube-api-access-2dkd7\") pod \"openstack-operator-controller-init-6f44f7b99f-4bphf\" (UID: \"86d787b0-daa0-45e6-8c5f-a540f61ec19a\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" Mar 07 07:09:40 crc kubenswrapper[4941]: I0307 07:09:40.226903 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dkd7\" (UniqueName: \"kubernetes.io/projected/86d787b0-daa0-45e6-8c5f-a540f61ec19a-kube-api-access-2dkd7\") pod \"openstack-operator-controller-init-6f44f7b99f-4bphf\" (UID: \"86d787b0-daa0-45e6-8c5f-a540f61ec19a\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" Mar 07 07:09:40 crc kubenswrapper[4941]: I0307 07:09:40.296537 4941 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" Mar 07 07:09:40 crc kubenswrapper[4941]: I0307 07:09:40.750491 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf"] Mar 07 07:09:41 crc kubenswrapper[4941]: I0307 07:09:41.314143 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" event={"ID":"86d787b0-daa0-45e6-8c5f-a540f61ec19a","Type":"ContainerStarted","Data":"2c925bce1621323203ee75f9015278e3d8c31793982ebe40efb9d602f3d9fe04"} Mar 07 07:09:46 crc kubenswrapper[4941]: I0307 07:09:46.351261 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" event={"ID":"86d787b0-daa0-45e6-8c5f-a540f61ec19a","Type":"ContainerStarted","Data":"c3c9c1812a62a7cbd24cb96bfc58c09b6d04233248cd47486bd64a9fa96efb85"} Mar 07 07:09:46 crc kubenswrapper[4941]: I0307 07:09:46.352244 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" Mar 07 07:09:46 crc kubenswrapper[4941]: I0307 07:09:46.384531 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" podStartSLOduration=2.701460421 podStartE2EDuration="7.384501233s" podCreationTimestamp="2026-03-07 07:09:39 +0000 UTC" firstStartedPulling="2026-03-07 07:09:40.760498387 +0000 UTC m=+1077.712863852" lastFinishedPulling="2026-03-07 07:09:45.443539189 +0000 UTC m=+1082.395904664" observedRunningTime="2026-03-07 07:09:46.382021032 +0000 UTC m=+1083.334386527" watchObservedRunningTime="2026-03-07 07:09:46.384501233 +0000 UTC m=+1083.336866738" Mar 07 07:09:50 crc kubenswrapper[4941]: I0307 07:09:50.300746 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4bphf" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.138435 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547790-5jgkv"] Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.140039 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547790-5jgkv" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.142430 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.142561 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.143448 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.148183 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547790-5jgkv"] Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.246018 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsdcl\" (UniqueName: \"kubernetes.io/projected/5bd90b16-b290-4cce-bf50-05e5c4f30a48-kube-api-access-lsdcl\") pod \"auto-csr-approver-29547790-5jgkv\" (UID: \"5bd90b16-b290-4cce-bf50-05e5c4f30a48\") " pod="openshift-infra/auto-csr-approver-29547790-5jgkv" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.347781 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsdcl\" (UniqueName: \"kubernetes.io/projected/5bd90b16-b290-4cce-bf50-05e5c4f30a48-kube-api-access-lsdcl\") pod \"auto-csr-approver-29547790-5jgkv\" (UID: \"5bd90b16-b290-4cce-bf50-05e5c4f30a48\") " 
pod="openshift-infra/auto-csr-approver-29547790-5jgkv" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.369490 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsdcl\" (UniqueName: \"kubernetes.io/projected/5bd90b16-b290-4cce-bf50-05e5c4f30a48-kube-api-access-lsdcl\") pod \"auto-csr-approver-29547790-5jgkv\" (UID: \"5bd90b16-b290-4cce-bf50-05e5c4f30a48\") " pod="openshift-infra/auto-csr-approver-29547790-5jgkv" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.460750 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547790-5jgkv" Mar 07 07:10:00 crc kubenswrapper[4941]: I0307 07:10:00.726336 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547790-5jgkv"] Mar 07 07:10:00 crc kubenswrapper[4941]: W0307 07:10:00.742626 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bd90b16_b290_4cce_bf50_05e5c4f30a48.slice/crio-d61ce65832a059c50659d931e8201c85659608dd267f4d138a5918a966e40661 WatchSource:0}: Error finding container d61ce65832a059c50659d931e8201c85659608dd267f4d138a5918a966e40661: Status 404 returned error can't find the container with id d61ce65832a059c50659d931e8201c85659608dd267f4d138a5918a966e40661 Mar 07 07:10:01 crc kubenswrapper[4941]: I0307 07:10:01.447833 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547790-5jgkv" event={"ID":"5bd90b16-b290-4cce-bf50-05e5c4f30a48","Type":"ContainerStarted","Data":"d61ce65832a059c50659d931e8201c85659608dd267f4d138a5918a966e40661"} Mar 07 07:10:02 crc kubenswrapper[4941]: I0307 07:10:02.458476 4941 generic.go:334] "Generic (PLEG): container finished" podID="5bd90b16-b290-4cce-bf50-05e5c4f30a48" containerID="6c7d9ada163db1d30ec3ff5c17caf22f26609e7b8d28d9fa92e24fda8b2a54d8" exitCode=0 Mar 07 07:10:02 crc kubenswrapper[4941]: 
I0307 07:10:02.458530 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547790-5jgkv" event={"ID":"5bd90b16-b290-4cce-bf50-05e5c4f30a48","Type":"ContainerDied","Data":"6c7d9ada163db1d30ec3ff5c17caf22f26609e7b8d28d9fa92e24fda8b2a54d8"} Mar 07 07:10:03 crc kubenswrapper[4941]: I0307 07:10:03.816304 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547790-5jgkv" Mar 07 07:10:03 crc kubenswrapper[4941]: I0307 07:10:03.997293 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsdcl\" (UniqueName: \"kubernetes.io/projected/5bd90b16-b290-4cce-bf50-05e5c4f30a48-kube-api-access-lsdcl\") pod \"5bd90b16-b290-4cce-bf50-05e5c4f30a48\" (UID: \"5bd90b16-b290-4cce-bf50-05e5c4f30a48\") " Mar 07 07:10:04 crc kubenswrapper[4941]: I0307 07:10:04.004315 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd90b16-b290-4cce-bf50-05e5c4f30a48-kube-api-access-lsdcl" (OuterVolumeSpecName: "kube-api-access-lsdcl") pod "5bd90b16-b290-4cce-bf50-05e5c4f30a48" (UID: "5bd90b16-b290-4cce-bf50-05e5c4f30a48"). InnerVolumeSpecName "kube-api-access-lsdcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:10:04 crc kubenswrapper[4941]: I0307 07:10:04.098846 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsdcl\" (UniqueName: \"kubernetes.io/projected/5bd90b16-b290-4cce-bf50-05e5c4f30a48-kube-api-access-lsdcl\") on node \"crc\" DevicePath \"\"" Mar 07 07:10:04 crc kubenswrapper[4941]: I0307 07:10:04.474187 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547790-5jgkv" event={"ID":"5bd90b16-b290-4cce-bf50-05e5c4f30a48","Type":"ContainerDied","Data":"d61ce65832a059c50659d931e8201c85659608dd267f4d138a5918a966e40661"} Mar 07 07:10:04 crc kubenswrapper[4941]: I0307 07:10:04.474793 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d61ce65832a059c50659d931e8201c85659608dd267f4d138a5918a966e40661" Mar 07 07:10:04 crc kubenswrapper[4941]: I0307 07:10:04.474220 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547790-5jgkv" Mar 07 07:10:04 crc kubenswrapper[4941]: I0307 07:10:04.881222 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547784-7g5m7"] Mar 07 07:10:04 crc kubenswrapper[4941]: I0307 07:10:04.886453 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547784-7g5m7"] Mar 07 07:10:05 crc kubenswrapper[4941]: I0307 07:10:05.961800 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89de2b71-4ba3-4287-8689-52d30ddea0bd" path="/var/lib/kubelet/pods/89de2b71-4ba3-4287-8689-52d30ddea0bd/volumes" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.485599 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft"] Mar 07 07:10:09 crc kubenswrapper[4941]: E0307 07:10:09.486205 4941 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5bd90b16-b290-4cce-bf50-05e5c4f30a48" containerName="oc" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.486218 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd90b16-b290-4cce-bf50-05e5c4f30a48" containerName="oc" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.486339 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd90b16-b290-4cce-bf50-05e5c4f30a48" containerName="oc" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.486775 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.490096 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-q5gvs" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.491930 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-t468c"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.492822 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.494939 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-p6tzj" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.499675 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.500380 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.513619 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.523947 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-x4s94" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.538436 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-t468c"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.568714 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.569821 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.574762 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-vktbx" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.576332 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl5nk\" (UniqueName: \"kubernetes.io/projected/3df07af4-0aa2-4795-a129-22be2b991b9d-kube-api-access-wl5nk\") pod \"barbican-operator-controller-manager-6db6876945-t468c\" (UID: \"3df07af4-0aa2-4795-a129-22be2b991b9d\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.576478 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bgrt\" (UniqueName: \"kubernetes.io/projected/4194af88-c299-4713-a885-adb8cceedc13-kube-api-access-4bgrt\") pod \"cinder-operator-controller-manager-55d77d7b5c-sbpft\" (UID: \"4194af88-c299-4713-a885-adb8cceedc13\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.578841 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.589247 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.602889 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.605695 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.611601 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jrp9b" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.628343 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.629455 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.640324 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-8b6vz" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.649743 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.662538 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.681112 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl5nk\" (UniqueName: \"kubernetes.io/projected/3df07af4-0aa2-4795-a129-22be2b991b9d-kube-api-access-wl5nk\") pod \"barbican-operator-controller-manager-6db6876945-t468c\" (UID: \"3df07af4-0aa2-4795-a129-22be2b991b9d\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.681163 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cff8\" (UniqueName: 
\"kubernetes.io/projected/2158c14b-9b89-48d1-b76f-9b98bbfc6972-kube-api-access-7cff8\") pod \"designate-operator-controller-manager-5d87c9d997-qlb5v\" (UID: \"2158c14b-9b89-48d1-b76f-9b98bbfc6972\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.681209 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qrs\" (UniqueName: \"kubernetes.io/projected/a698d941-ce95-43c6-9512-1259d85a4cce-kube-api-access-48qrs\") pod \"glance-operator-controller-manager-64db6967f8-8qzr4\" (UID: \"a698d941-ce95-43c6-9512-1259d85a4cce\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.681284 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bgrt\" (UniqueName: \"kubernetes.io/projected/4194af88-c299-4713-a885-adb8cceedc13-kube-api-access-4bgrt\") pod \"cinder-operator-controller-manager-55d77d7b5c-sbpft\" (UID: \"4194af88-c299-4713-a885-adb8cceedc13\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.715165 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.716095 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.722788 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.723720 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vkd9w" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.727540 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl5nk\" (UniqueName: \"kubernetes.io/projected/3df07af4-0aa2-4795-a129-22be2b991b9d-kube-api-access-wl5nk\") pod \"barbican-operator-controller-manager-6db6876945-t468c\" (UID: \"3df07af4-0aa2-4795-a129-22be2b991b9d\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.751371 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.753227 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bgrt\" (UniqueName: \"kubernetes.io/projected/4194af88-c299-4713-a885-adb8cceedc13-kube-api-access-4bgrt\") pod \"cinder-operator-controller-manager-55d77d7b5c-sbpft\" (UID: \"4194af88-c299-4713-a885-adb8cceedc13\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.758125 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.759327 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.762506 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.763524 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.766245 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.780500 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-khmcz" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.785234 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czq94\" (UniqueName: \"kubernetes.io/projected/e803a3db-78f9-4d84-96a8-ffff5f62fe09-kube-api-access-czq94\") pod \"heat-operator-controller-manager-cf99c678f-bvmsw\" (UID: \"e803a3db-78f9-4d84-96a8-ffff5f62fe09\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.785293 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qrs\" (UniqueName: \"kubernetes.io/projected/a698d941-ce95-43c6-9512-1259d85a4cce-kube-api-access-48qrs\") pod \"glance-operator-controller-manager-64db6967f8-8qzr4\" (UID: \"a698d941-ce95-43c6-9512-1259d85a4cce\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.785381 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ftftq\" (UniqueName: \"kubernetes.io/projected/47672605-5408-4ff2-8b41-557efdcafbaf-kube-api-access-ftftq\") pod \"horizon-operator-controller-manager-78bc7f9bd9-7wr8g\" (UID: \"47672605-5408-4ff2-8b41-557efdcafbaf\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.785459 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cff8\" (UniqueName: \"kubernetes.io/projected/2158c14b-9b89-48d1-b76f-9b98bbfc6972-kube-api-access-7cff8\") pod \"designate-operator-controller-manager-5d87c9d997-qlb5v\" (UID: \"2158c14b-9b89-48d1-b76f-9b98bbfc6972\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.787555 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.787885 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cvgjf" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.800530 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.801512 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.812726 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.818435 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-xwgkg" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.856548 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.866295 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.867857 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cff8\" (UniqueName: \"kubernetes.io/projected/2158c14b-9b89-48d1-b76f-9b98bbfc6972-kube-api-access-7cff8\") pod \"designate-operator-controller-manager-5d87c9d997-qlb5v\" (UID: \"2158c14b-9b89-48d1-b76f-9b98bbfc6972\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.873989 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.887973 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvc57\" (UniqueName: \"kubernetes.io/projected/16a86642-eb42-44bd-b668-8295e2316f09-kube-api-access-fvc57\") pod \"manila-operator-controller-manager-67d996989d-mv6rg\" (UID: \"16a86642-eb42-44bd-b668-8295e2316f09\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.888034 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftftq\" (UniqueName: \"kubernetes.io/projected/47672605-5408-4ff2-8b41-557efdcafbaf-kube-api-access-ftftq\") pod \"horizon-operator-controller-manager-78bc7f9bd9-7wr8g\" (UID: \"47672605-5408-4ff2-8b41-557efdcafbaf\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.888056 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.888137 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5jgt\" (UniqueName: \"kubernetes.io/projected/5dee621a-ccf7-486f-9865-fba380e4e1b1-kube-api-access-m5jgt\") pod \"keystone-operator-controller-manager-7c789f89c6-6r26z\" (UID: \"5dee621a-ccf7-486f-9865-fba380e4e1b1\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" Mar 07 07:10:09 crc 
kubenswrapper[4941]: I0307 07:10:09.888170 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlszk\" (UniqueName: \"kubernetes.io/projected/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-kube-api-access-nlszk\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.888203 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czq94\" (UniqueName: \"kubernetes.io/projected/e803a3db-78f9-4d84-96a8-ffff5f62fe09-kube-api-access-czq94\") pod \"heat-operator-controller-manager-cf99c678f-bvmsw\" (UID: \"e803a3db-78f9-4d84-96a8-ffff5f62fe09\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.888285 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jws\" (UniqueName: \"kubernetes.io/projected/4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf-kube-api-access-l9jws\") pod \"ironic-operator-controller-manager-545456dc4-kd7rp\" (UID: \"4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.896213 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qrs\" (UniqueName: \"kubernetes.io/projected/a698d941-ce95-43c6-9512-1259d85a4cce-kube-api-access-48qrs\") pod \"glance-operator-controller-manager-64db6967f8-8qzr4\" (UID: \"a698d941-ce95-43c6-9512-1259d85a4cce\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.896287 4941 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj"] Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.897324 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.904338 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gzs24" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.923134 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.923192 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czq94\" (UniqueName: \"kubernetes.io/projected/e803a3db-78f9-4d84-96a8-ffff5f62fe09-kube-api-access-czq94\") pod \"heat-operator-controller-manager-cf99c678f-bvmsw\" (UID: \"e803a3db-78f9-4d84-96a8-ffff5f62fe09\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.931343 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftftq\" (UniqueName: \"kubernetes.io/projected/47672605-5408-4ff2-8b41-557efdcafbaf-kube-api-access-ftftq\") pod \"horizon-operator-controller-manager-78bc7f9bd9-7wr8g\" (UID: \"47672605-5408-4ff2-8b41-557efdcafbaf\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.942640 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.958817 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.989650 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5jgt\" (UniqueName: \"kubernetes.io/projected/5dee621a-ccf7-486f-9865-fba380e4e1b1-kube-api-access-m5jgt\") pod \"keystone-operator-controller-manager-7c789f89c6-6r26z\" (UID: \"5dee621a-ccf7-486f-9865-fba380e4e1b1\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.989702 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlszk\" (UniqueName: \"kubernetes.io/projected/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-kube-api-access-nlszk\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.989759 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jws\" (UniqueName: \"kubernetes.io/projected/4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf-kube-api-access-l9jws\") pod \"ironic-operator-controller-manager-545456dc4-kd7rp\" (UID: \"4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.989796 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvc57\" (UniqueName: \"kubernetes.io/projected/16a86642-eb42-44bd-b668-8295e2316f09-kube-api-access-fvc57\") pod \"manila-operator-controller-manager-67d996989d-mv6rg\" (UID: \"16a86642-eb42-44bd-b668-8295e2316f09\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 
07:10:09.989817 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:09 crc kubenswrapper[4941]: I0307 07:10:09.989899 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfnjh\" (UniqueName: \"kubernetes.io/projected/f2b19678-f6f3-41fb-8534-f0b826b523f2-kube-api-access-cfnjh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-bctkj\" (UID: \"f2b19678-f6f3-41fb-8534-f0b826b523f2\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:09.990732 4941 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:09.990783 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert podName:466fcef1-3bd0-4fff-8e3b-c5dbea9cad30 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:10.490768461 +0000 UTC m=+1107.443133916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert") pod "infra-operator-controller-manager-f7fcc58b9-5lfxq" (UID: "466fcef1-3bd0-4fff-8e3b-c5dbea9cad30") : secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.006578 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.007312 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.013764 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6n7pb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.028895 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlszk\" (UniqueName: \"kubernetes.io/projected/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-kube-api-access-nlszk\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.049072 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jws\" (UniqueName: \"kubernetes.io/projected/4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf-kube-api-access-l9jws\") pod \"ironic-operator-controller-manager-545456dc4-kd7rp\" (UID: \"4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.050545 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj"] 
Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.051099 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvc57\" (UniqueName: \"kubernetes.io/projected/16a86642-eb42-44bd-b668-8295e2316f09-kube-api-access-fvc57\") pod \"manila-operator-controller-manager-67d996989d-mv6rg\" (UID: \"16a86642-eb42-44bd-b668-8295e2316f09\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.052978 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5jgt\" (UniqueName: \"kubernetes.io/projected/5dee621a-ccf7-486f-9865-fba380e4e1b1-kube-api-access-m5jgt\") pod \"keystone-operator-controller-manager-7c789f89c6-6r26z\" (UID: \"5dee621a-ccf7-486f-9865-fba380e4e1b1\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.079370 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.083786 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.090926 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.091718 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfnjh\" (UniqueName: \"kubernetes.io/projected/f2b19678-f6f3-41fb-8534-f0b826b523f2-kube-api-access-cfnjh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-bctkj\" (UID: \"f2b19678-f6f3-41fb-8534-f0b826b523f2\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.103396 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2zx\" (UniqueName: \"kubernetes.io/projected/b292f34f-0728-4c26-8122-3ac065824456-kube-api-access-dx2zx\") pod \"neutron-operator-controller-manager-54688575f-d7n9x\" (UID: \"b292f34f-0728-4c26-8122-3ac065824456\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.108450 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bcl2z" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.120135 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.124568 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.125223 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfnjh\" (UniqueName: \"kubernetes.io/projected/f2b19678-f6f3-41fb-8534-f0b826b523f2-kube-api-access-cfnjh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-bctkj\" (UID: \"f2b19678-f6f3-41fb-8534-f0b826b523f2\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.127228 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-vrvf6" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.135868 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.171521 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.194040 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.195656 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.205261 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4s864" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.206410 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhbwz\" (UniqueName: \"kubernetes.io/projected/19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478-kube-api-access-dhbwz\") pod \"nova-operator-controller-manager-74b6b5dc96-g86j9\" (UID: \"19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.206510 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvd5c\" (UniqueName: \"kubernetes.io/projected/56d9967c-b8ab-43e8-be9d-0593d1e3f320-kube-api-access-dvd5c\") pod \"octavia-operator-controller-manager-5d86c7ddb7-xjw4j\" (UID: \"56d9967c-b8ab-43e8-be9d-0593d1e3f320\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.206559 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx2zx\" (UniqueName: \"kubernetes.io/projected/b292f34f-0728-4c26-8122-3ac065824456-kube-api-access-dx2zx\") pod \"neutron-operator-controller-manager-54688575f-d7n9x\" (UID: \"b292f34f-0728-4c26-8122-3ac065824456\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.217993 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 
07:10:10.228397 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.243342 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.245120 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.249104 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-g4g8h" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.251428 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx2zx\" (UniqueName: \"kubernetes.io/projected/b292f34f-0728-4c26-8122-3ac065824456-kube-api-access-dx2zx\") pod \"neutron-operator-controller-manager-54688575f-d7n9x\" (UID: \"b292f34f-0728-4c26-8122-3ac065824456\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.259918 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.271106 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.271379 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.282817 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.283907 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.285761 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.290036 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5tjzj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.312601 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhbwz\" (UniqueName: \"kubernetes.io/projected/19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478-kube-api-access-dhbwz\") pod \"nova-operator-controller-manager-74b6b5dc96-g86j9\" (UID: \"19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.312900 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdpxm\" (UniqueName: \"kubernetes.io/projected/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-kube-api-access-vdpxm\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.312999 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dvd5c\" (UniqueName: \"kubernetes.io/projected/56d9967c-b8ab-43e8-be9d-0593d1e3f320-kube-api-access-dvd5c\") pod \"octavia-operator-controller-manager-5d86c7ddb7-xjw4j\" (UID: \"56d9967c-b8ab-43e8-be9d-0593d1e3f320\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.313118 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.325510 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.326518 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.332344 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zc7lv" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.332989 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.339719 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.347846 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.348498 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.349333 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvd5c\" (UniqueName: \"kubernetes.io/projected/56d9967c-b8ab-43e8-be9d-0593d1e3f320-kube-api-access-dvd5c\") pod \"octavia-operator-controller-manager-5d86c7ddb7-xjw4j\" (UID: \"56d9967c-b8ab-43e8-be9d-0593d1e3f320\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.354719 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhbwz\" (UniqueName: \"kubernetes.io/projected/19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478-kube-api-access-dhbwz\") pod \"nova-operator-controller-manager-74b6b5dc96-g86j9\" (UID: \"19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.358712 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.360240 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.363665 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.365974 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-tgxsn" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.373757 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.376419 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.379346 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rfz4s" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.383460 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.389131 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.391602 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.393852 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-t6xh9" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.395942 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.414097 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jntqr\" (UniqueName: \"kubernetes.io/projected/a1f803c1-f954-4aff-b54e-2baae04f1bbf-kube-api-access-jntqr\") pod \"swift-operator-controller-manager-9b9ff9f4d-pbkqb\" (UID: \"a1f803c1-f954-4aff-b54e-2baae04f1bbf\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.414147 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.414200 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wfrz\" (UniqueName: \"kubernetes.io/projected/0dfad6cb-1bbf-4af8-bd06-efc92bfd4347-kube-api-access-9wfrz\") pod \"placement-operator-controller-manager-648564c9fc-6srmj\" (UID: \"0dfad6cb-1bbf-4af8-bd06-efc92bfd4347\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.414227 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzlk2\" (UniqueName: \"kubernetes.io/projected/c0b92dab-fef5-4bf2-b07d-f3787dc8060c-kube-api-access-qzlk2\") pod \"ovn-operator-controller-manager-75684d597f-kbm66\" (UID: \"c0b92dab-fef5-4bf2-b07d-f3787dc8060c\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.414247 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdpxm\" (UniqueName: \"kubernetes.io/projected/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-kube-api-access-vdpxm\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.414607 4941 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.414653 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert podName:63bae88f-5e2e-4e53-9e5e-e7d31ca511d1 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:10.914639552 +0000 UTC m=+1107.867005017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" (UID: "63bae88f-5e2e-4e53-9e5e-e7d31ca511d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.415221 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.416194 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.418691 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.419270 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.420318 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qv55g" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.422500 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.453896 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdpxm\" (UniqueName: \"kubernetes.io/projected/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-kube-api-access-vdpxm\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:10 crc 
kubenswrapper[4941]: I0307 07:10:10.458091 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.459292 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.469584 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-b7ztq" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.473717 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.474863 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.475465 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515294 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wfrz\" (UniqueName: \"kubernetes.io/projected/0dfad6cb-1bbf-4af8-bd06-efc92bfd4347-kube-api-access-9wfrz\") pod \"placement-operator-controller-manager-648564c9fc-6srmj\" (UID: \"0dfad6cb-1bbf-4af8-bd06-efc92bfd4347\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515373 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzlk2\" (UniqueName: \"kubernetes.io/projected/c0b92dab-fef5-4bf2-b07d-f3787dc8060c-kube-api-access-qzlk2\") pod \"ovn-operator-controller-manager-75684d597f-kbm66\" (UID: \"c0b92dab-fef5-4bf2-b07d-f3787dc8060c\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515455 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rtm6\" (UniqueName: \"kubernetes.io/projected/87e61107-4868-497d-a7fa-73f56f084ff2-kube-api-access-7rtm6\") pod \"test-operator-controller-manager-55b5ff4dbb-hqpm5\" (UID: \"87e61107-4868-497d-a7fa-73f56f084ff2\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515520 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:10 crc 
kubenswrapper[4941]: I0307 07:10:10.515556 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mrr5\" (UniqueName: \"kubernetes.io/projected/5dbee8a7-f5e1-44df-ae39-850574975086-kube-api-access-9mrr5\") pod \"telemetry-operator-controller-manager-5fdb694969-wtfh5\" (UID: \"5dbee8a7-f5e1-44df-ae39-850574975086\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515620 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515662 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jntqr\" (UniqueName: \"kubernetes.io/projected/a1f803c1-f954-4aff-b54e-2baae04f1bbf-kube-api-access-jntqr\") pod \"swift-operator-controller-manager-9b9ff9f4d-pbkqb\" (UID: \"a1f803c1-f954-4aff-b54e-2baae04f1bbf\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515694 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v257p\" (UniqueName: \"kubernetes.io/projected/90ee94e8-276b-476f-a6b6-4729bbd5fab3-kube-api-access-v257p\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515721 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.515824 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njqwv\" (UniqueName: \"kubernetes.io/projected/12f08e48-5775-4ba5-8321-e68eee8fd2c6-kube-api-access-njqwv\") pod \"watcher-operator-controller-manager-bccc79885-8t2kp\" (UID: \"12f08e48-5775-4ba5-8321-e68eee8fd2c6\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.516035 4941 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.516090 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert podName:466fcef1-3bd0-4fff-8e3b-c5dbea9cad30 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:11.516070649 +0000 UTC m=+1108.468436114 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert") pod "infra-operator-controller-manager-f7fcc58b9-5lfxq" (UID: "466fcef1-3bd0-4fff-8e3b-c5dbea9cad30") : secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.547846 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wfrz\" (UniqueName: \"kubernetes.io/projected/0dfad6cb-1bbf-4af8-bd06-efc92bfd4347-kube-api-access-9wfrz\") pod \"placement-operator-controller-manager-648564c9fc-6srmj\" (UID: \"0dfad6cb-1bbf-4af8-bd06-efc92bfd4347\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.558790 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jntqr\" (UniqueName: \"kubernetes.io/projected/a1f803c1-f954-4aff-b54e-2baae04f1bbf-kube-api-access-jntqr\") pod \"swift-operator-controller-manager-9b9ff9f4d-pbkqb\" (UID: \"a1f803c1-f954-4aff-b54e-2baae04f1bbf\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.560085 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzlk2\" (UniqueName: \"kubernetes.io/projected/c0b92dab-fef5-4bf2-b07d-f3787dc8060c-kube-api-access-qzlk2\") pod \"ovn-operator-controller-manager-75684d597f-kbm66\" (UID: \"c0b92dab-fef5-4bf2-b07d-f3787dc8060c\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.619060 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rtm6\" (UniqueName: \"kubernetes.io/projected/87e61107-4868-497d-a7fa-73f56f084ff2-kube-api-access-7rtm6\") pod \"test-operator-controller-manager-55b5ff4dbb-hqpm5\" (UID: 
\"87e61107-4868-497d-a7fa-73f56f084ff2\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.619108 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4qtn\" (UniqueName: \"kubernetes.io/projected/42861835-d760-4e61-b9c2-cd0f3e3478d8-kube-api-access-b4qtn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wwtfx\" (UID: \"42861835-d760-4e61-b9c2-cd0f3e3478d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.619135 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.619155 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mrr5\" (UniqueName: \"kubernetes.io/projected/5dbee8a7-f5e1-44df-ae39-850574975086-kube-api-access-9mrr5\") pod \"telemetry-operator-controller-manager-5fdb694969-wtfh5\" (UID: \"5dbee8a7-f5e1-44df-ae39-850574975086\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.619193 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v257p\" (UniqueName: \"kubernetes.io/projected/90ee94e8-276b-476f-a6b6-4729bbd5fab3-kube-api-access-v257p\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:10 crc 
kubenswrapper[4941]: I0307 07:10:10.619208 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.619246 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njqwv\" (UniqueName: \"kubernetes.io/projected/12f08e48-5775-4ba5-8321-e68eee8fd2c6-kube-api-access-njqwv\") pod \"watcher-operator-controller-manager-bccc79885-8t2kp\" (UID: \"12f08e48-5775-4ba5-8321-e68eee8fd2c6\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.619431 4941 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.619473 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:11.119457704 +0000 UTC m=+1108.071823169 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "metrics-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.619667 4941 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.619695 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:11.11968829 +0000 UTC m=+1108.072053745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.640845 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njqwv\" (UniqueName: \"kubernetes.io/projected/12f08e48-5775-4ba5-8321-e68eee8fd2c6-kube-api-access-njqwv\") pod \"watcher-operator-controller-manager-bccc79885-8t2kp\" (UID: \"12f08e48-5775-4ba5-8321-e68eee8fd2c6\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.644449 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rtm6\" (UniqueName: \"kubernetes.io/projected/87e61107-4868-497d-a7fa-73f56f084ff2-kube-api-access-7rtm6\") pod \"test-operator-controller-manager-55b5ff4dbb-hqpm5\" (UID: \"87e61107-4868-497d-a7fa-73f56f084ff2\") " 
pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.648220 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mrr5\" (UniqueName: \"kubernetes.io/projected/5dbee8a7-f5e1-44df-ae39-850574975086-kube-api-access-9mrr5\") pod \"telemetry-operator-controller-manager-5fdb694969-wtfh5\" (UID: \"5dbee8a7-f5e1-44df-ae39-850574975086\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.654950 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v257p\" (UniqueName: \"kubernetes.io/projected/90ee94e8-276b-476f-a6b6-4729bbd5fab3-kube-api-access-v257p\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.694590 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.722449 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4qtn\" (UniqueName: \"kubernetes.io/projected/42861835-d760-4e61-b9c2-cd0f3e3478d8-kube-api-access-b4qtn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wwtfx\" (UID: \"42861835-d760-4e61-b9c2-cd0f3e3478d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.732473 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.761973 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.803243 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.816469 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.832527 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4qtn\" (UniqueName: \"kubernetes.io/projected/42861835-d760-4e61-b9c2-cd0f3e3478d8-kube-api-access-b4qtn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wwtfx\" (UID: \"42861835-d760-4e61-b9c2-cd0f3e3478d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.833149 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.846163 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.852511 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.861878 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-t468c"] Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.875695 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw"] Mar 07 07:10:10 crc kubenswrapper[4941]: W0307 07:10:10.881816 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode803a3db_78f9_4d84_96a8_ffff5f62fe09.slice/crio-2fd4aeaea9fba7cd2d637785112d02fc36dd25e457d69185cc7895c9b14ced2f WatchSource:0}: Error finding container 2fd4aeaea9fba7cd2d637785112d02fc36dd25e457d69185cc7895c9b14ced2f: Status 404 returned error can't find the container with id 2fd4aeaea9fba7cd2d637785112d02fc36dd25e457d69185cc7895c9b14ced2f Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.884077 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" Mar 07 07:10:10 crc kubenswrapper[4941]: W0307 07:10:10.884675 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda698d941_ce95_43c6_9512_1259d85a4cce.slice/crio-a059eb1d6249c1303391d3a13ba3f45a8c9fdb2b82c132ee3562bf659d7d0cd9 WatchSource:0}: Error finding container a059eb1d6249c1303391d3a13ba3f45a8c9fdb2b82c132ee3562bf659d7d0cd9: Status 404 returned error can't find the container with id a059eb1d6249c1303391d3a13ba3f45a8c9fdb2b82c132ee3562bf659d7d0cd9 Mar 07 07:10:10 crc kubenswrapper[4941]: I0307 07:10:10.928092 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.930054 4941 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:10 crc kubenswrapper[4941]: E0307 07:10:10.930128 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert podName:63bae88f-5e2e-4e53-9e5e-e7d31ca511d1 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:11.930097011 +0000 UTC m=+1108.882462476 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" (UID: "63bae88f-5e2e-4e53-9e5e-e7d31ca511d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.069601 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.088857 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g"] Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.100809 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2158c14b_9b89_48d1_b76f_9b98bbfc6972.slice/crio-e0f4dcb3609da17009e317893ae922c237b007338f94d9c02c0d2c07d6e161f5 WatchSource:0}: Error finding container e0f4dcb3609da17009e317893ae922c237b007338f94d9c02c0d2c07d6e161f5: Status 404 returned error can't find the container with id e0f4dcb3609da17009e317893ae922c237b007338f94d9c02c0d2c07d6e161f5 Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.131926 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.132000 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod 
\"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.132508 4941 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.132570 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:12.132553196 +0000 UTC m=+1109.084918661 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "metrics-server-cert" not found Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.132852 4941 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.132886 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:12.132877324 +0000 UTC m=+1109.085242789 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "webhook-server-cert" not found Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.239279 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.248440 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.254076 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.264258 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg"] Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.269454 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2b19678_f6f3_41fb_8534_f0b826b523f2.slice/crio-512728aed553979ad992f595bb710276471d29928cac298cdec532c96dbd462f WatchSource:0}: Error finding container 512728aed553979ad992f595bb710276471d29928cac298cdec532c96dbd462f: Status 404 returned error can't find the container with id 512728aed553979ad992f595bb710276471d29928cac298cdec532c96dbd462f Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.283698 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x"] Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.285056 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef2ce4a_5c3e_436c_bd56_dc15ac199bbf.slice/crio-9a6e1ce9e5921a1f470f5156c805a395ae4aa9ddb711dcfcea2295990e83f987 WatchSource:0}: Error finding container 9a6e1ce9e5921a1f470f5156c805a395ae4aa9ddb711dcfcea2295990e83f987: Status 404 returned error can't find the container with id 9a6e1ce9e5921a1f470f5156c805a395ae4aa9ddb711dcfcea2295990e83f987 Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.448388 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.466041 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9"] Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.489072 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e5c8c2_a9b7_41be_9a9c_9bc60ddd1478.slice/crio-abda65ef52df9a2e2cb918c4bdc3efa49be240486d8f2c7a70be62020e7ebb55 WatchSource:0}: Error finding container abda65ef52df9a2e2cb918c4bdc3efa49be240486d8f2c7a70be62020e7ebb55: Status 404 returned error can't find the container with id abda65ef52df9a2e2cb918c4bdc3efa49be240486d8f2c7a70be62020e7ebb55 Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.492002 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dhbwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-g86j9_openstack-operators(19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.494682 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" podUID="19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478" Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.536562 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.536816 4941 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.536914 
4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert podName:466fcef1-3bd0-4fff-8e3b-c5dbea9cad30 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:13.536890462 +0000 UTC m=+1110.489255977 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert") pod "infra-operator-controller-manager-f7fcc58b9-5lfxq" (UID: "466fcef1-3bd0-4fff-8e3b-c5dbea9cad30") : secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.550379 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.567828 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.570779 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" event={"ID":"b292f34f-0728-4c26-8122-3ac065824456","Type":"ContainerStarted","Data":"29648d4e0f38d37366ab7de02fa4eeedecb43e0329ae56db6f92fdc0833a0616"} Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.572309 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" event={"ID":"4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf","Type":"ContainerStarted","Data":"9a6e1ce9e5921a1f470f5156c805a395ae4aa9ddb711dcfcea2295990e83f987"} Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.573575 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" 
event={"ID":"56d9967c-b8ab-43e8-be9d-0593d1e3f320","Type":"ContainerStarted","Data":"5475addaa54952e5f082f5435b091562ee0547b51dcb02fa39b136553e80d1be"} Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.574882 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42861835_d760_4e61_b9c2_cd0f3e3478d8.slice/crio-1b9b5a227a80f5a28293e7ebdf40fd9fc3987f3ded23ae16e70c7971fa597497 WatchSource:0}: Error finding container 1b9b5a227a80f5a28293e7ebdf40fd9fc3987f3ded23ae16e70c7971fa597497: Status 404 returned error can't find the container with id 1b9b5a227a80f5a28293e7ebdf40fd9fc3987f3ded23ae16e70c7971fa597497 Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.575940 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.578986 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" event={"ID":"f2b19678-f6f3-41fb-8534-f0b826b523f2","Type":"ContainerStarted","Data":"512728aed553979ad992f595bb710276471d29928cac298cdec532c96dbd462f"} Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.582100 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1f803c1_f954_4aff_b54e_2baae04f1bbf.slice/crio-0c24793e591340073c6aac594192872fa3b4766cf9b729f6617d7c068d119ddd WatchSource:0}: Error finding container 0c24793e591340073c6aac594192872fa3b4766cf9b729f6617d7c068d119ddd: Status 404 returned error can't find the container with id 0c24793e591340073c6aac594192872fa3b4766cf9b729f6617d7c068d119ddd Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.582297 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" 
event={"ID":"3df07af4-0aa2-4795-a129-22be2b991b9d","Type":"ContainerStarted","Data":"bdb5807ec72f3d292016e5f8919f3e807d0e95c00aee19b3b01b352ce3594243"} Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.582357 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.584845 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" event={"ID":"4194af88-c299-4713-a885-adb8cceedc13","Type":"ContainerStarted","Data":"1aa1a2d9aa5a8959dabd76577e99530ab379e71cfbc634e03dbab29390d0105d"} Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.586313 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" event={"ID":"a698d941-ce95-43c6-9512-1259d85a4cce","Type":"ContainerStarted","Data":"a059eb1d6249c1303391d3a13ba3f45a8c9fdb2b82c132ee3562bf659d7d0cd9"} Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.590483 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" event={"ID":"16a86642-eb42-44bd-b668-8295e2316f09","Type":"ContainerStarted","Data":"0039441c1ac290f0842e05613be9d2b31020b100a1b64667fa037548201f755a"} Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.592979 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dfad6cb_1bbf_4af8_bd06_efc92bfd4347.slice/crio-ce5e2f128498c2ebe77c5b7fc29dcc4ce512a50a387952f564a5126245d875b8 WatchSource:0}: Error finding container ce5e2f128498c2ebe77c5b7fc29dcc4ce512a50a387952f564a5126245d875b8: Status 404 returned error can't find the container with id ce5e2f128498c2ebe77c5b7fc29dcc4ce512a50a387952f564a5126245d875b8 Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.595329 4941 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.602130 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" event={"ID":"47672605-5408-4ff2-8b41-557efdcafbaf","Type":"ContainerStarted","Data":"d195072a8d6f2e0909fa522658168251e05783a85297ba4ea40b2e101234fdec"} Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.603340 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qzlk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-75684d597f-kbm66_openstack-operators(c0b92dab-fef5-4bf2-b07d-f3787dc8060c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.603859 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9wfrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-6srmj_openstack-operators(0dfad6cb-1bbf-4af8-bd06-efc92bfd4347): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.604047 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e61107_4868_497d_a7fa_73f56f084ff2.slice/crio-1057be848bc8b106a84e99534c132e0c230cd33c703af2a7f4b713ec15ebaca2 WatchSource:0}: Error finding container 1057be848bc8b106a84e99534c132e0c230cd33c703af2a7f4b713ec15ebaca2: Status 404 returned error can't find the container with id 1057be848bc8b106a84e99534c132e0c230cd33c703af2a7f4b713ec15ebaca2 Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.604392 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4qtn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wwtfx_openstack-operators(42861835-d760-4e61-b9c2-cd0f3e3478d8): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.604458 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" podUID="c0b92dab-fef5-4bf2-b07d-f3787dc8060c" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.604955 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" podUID="0dfad6cb-1bbf-4af8-bd06-efc92bfd4347" Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.605005 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" event={"ID":"19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478","Type":"ContainerStarted","Data":"abda65ef52df9a2e2cb918c4bdc3efa49be240486d8f2c7a70be62020e7ebb55"} Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.605578 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" podUID="42861835-d760-4e61-b9c2-cd0f3e3478d8" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.606270 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" podUID="19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478" Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.608642 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5"] Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.609210 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" event={"ID":"e803a3db-78f9-4d84-96a8-ffff5f62fe09","Type":"ContainerStarted","Data":"2fd4aeaea9fba7cd2d637785112d02fc36dd25e457d69185cc7895c9b14ced2f"} Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.609265 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rtm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-hqpm5_openstack-operators(87e61107-4868-497d-a7fa-73f56f084ff2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.610723 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" podUID="87e61107-4868-497d-a7fa-73f56f084ff2" Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.617088 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" event={"ID":"5dee621a-ccf7-486f-9865-fba380e4e1b1","Type":"ContainerStarted","Data":"fd6a547765053a92f0fb21a00b78920f0e82afa23d502d2a2532c1eaa0708cc4"} Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.617313 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dbee8a7_f5e1_44df_ae39_850574975086.slice/crio-f85762ff073e5c3b79d57a5b86ba5c0e688bda93a337f6a1b469daf627bb0558 WatchSource:0}: Error finding 
container f85762ff073e5c3b79d57a5b86ba5c0e688bda93a337f6a1b469daf627bb0558: Status 404 returned error can't find the container with id f85762ff073e5c3b79d57a5b86ba5c0e688bda93a337f6a1b469daf627bb0558 Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.618680 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" event={"ID":"2158c14b-9b89-48d1-b76f-9b98bbfc6972","Type":"ContainerStarted","Data":"e0f4dcb3609da17009e317893ae922c237b007338f94d9c02c0d2c07d6e161f5"} Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.621163 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9mrr5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-wtfh5_openstack-operators(5dbee8a7-f5e1-44df-ae39-850574975086): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.622276 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" podUID="5dbee8a7-f5e1-44df-ae39-850574975086" Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.726137 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp"] Mar 07 07:10:11 crc kubenswrapper[4941]: W0307 07:10:11.733370 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f08e48_5775_4ba5_8321_e68eee8fd2c6.slice/crio-41e0e3806b4a0b233a9d8aacbac27f96d889391b91d13b1f5ba7161bf53ad1d3 WatchSource:0}: Error finding container 41e0e3806b4a0b233a9d8aacbac27f96d889391b91d13b1f5ba7161bf53ad1d3: Status 404 returned error can't find the container with id 41e0e3806b4a0b233a9d8aacbac27f96d889391b91d13b1f5ba7161bf53ad1d3 Mar 07 07:10:11 crc kubenswrapper[4941]: I0307 07:10:11.941581 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.941841 4941 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:11 crc kubenswrapper[4941]: E0307 07:10:11.941947 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert podName:63bae88f-5e2e-4e53-9e5e-e7d31ca511d1 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:13.941922895 +0000 UTC m=+1110.894288360 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" (UID: "63bae88f-5e2e-4e53-9e5e-e7d31ca511d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:12 crc kubenswrapper[4941]: I0307 07:10:12.144232 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:12 crc kubenswrapper[4941]: I0307 07:10:12.144416 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:12 crc kubenswrapper[4941]: E0307 07:10:12.144471 4941 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 07:10:12 crc kubenswrapper[4941]: E0307 07:10:12.144566 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:14.144544584 +0000 UTC m=+1111.096910049 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "metrics-server-cert" not found Mar 07 07:10:12 crc kubenswrapper[4941]: E0307 07:10:12.144597 4941 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 07:10:12 crc kubenswrapper[4941]: E0307 07:10:12.144689 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:14.144669847 +0000 UTC m=+1111.097035312 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "webhook-server-cert" not found Mar 07 07:10:12 crc kubenswrapper[4941]: I0307 07:10:12.648007 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" event={"ID":"87e61107-4868-497d-a7fa-73f56f084ff2","Type":"ContainerStarted","Data":"1057be848bc8b106a84e99534c132e0c230cd33c703af2a7f4b713ec15ebaca2"} Mar 07 07:10:12 crc kubenswrapper[4941]: E0307 07:10:12.655269 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" podUID="87e61107-4868-497d-a7fa-73f56f084ff2" Mar 07 07:10:12 crc 
kubenswrapper[4941]: I0307 07:10:12.665820 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" event={"ID":"42861835-d760-4e61-b9c2-cd0f3e3478d8","Type":"ContainerStarted","Data":"1b9b5a227a80f5a28293e7ebdf40fd9fc3987f3ded23ae16e70c7971fa597497"} Mar 07 07:10:12 crc kubenswrapper[4941]: E0307 07:10:12.669140 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" podUID="42861835-d760-4e61-b9c2-cd0f3e3478d8" Mar 07 07:10:12 crc kubenswrapper[4941]: I0307 07:10:12.682376 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" event={"ID":"5dbee8a7-f5e1-44df-ae39-850574975086","Type":"ContainerStarted","Data":"f85762ff073e5c3b79d57a5b86ba5c0e688bda93a337f6a1b469daf627bb0558"} Mar 07 07:10:12 crc kubenswrapper[4941]: I0307 07:10:12.686122 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" event={"ID":"c0b92dab-fef5-4bf2-b07d-f3787dc8060c","Type":"ContainerStarted","Data":"42889d53520c34e75e6fd0958b832717b875c24129278277926c8ee699db5fcd"} Mar 07 07:10:12 crc kubenswrapper[4941]: E0307 07:10:12.686994 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" podUID="5dbee8a7-f5e1-44df-ae39-850574975086" Mar 07 07:10:12 
crc kubenswrapper[4941]: E0307 07:10:12.688024 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" podUID="c0b92dab-fef5-4bf2-b07d-f3787dc8060c" Mar 07 07:10:12 crc kubenswrapper[4941]: I0307 07:10:12.690736 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" event={"ID":"a1f803c1-f954-4aff-b54e-2baae04f1bbf","Type":"ContainerStarted","Data":"0c24793e591340073c6aac594192872fa3b4766cf9b729f6617d7c068d119ddd"} Mar 07 07:10:12 crc kubenswrapper[4941]: I0307 07:10:12.697797 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" event={"ID":"0dfad6cb-1bbf-4af8-bd06-efc92bfd4347","Type":"ContainerStarted","Data":"ce5e2f128498c2ebe77c5b7fc29dcc4ce512a50a387952f564a5126245d875b8"} Mar 07 07:10:12 crc kubenswrapper[4941]: I0307 07:10:12.703329 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" event={"ID":"12f08e48-5775-4ba5-8321-e68eee8fd2c6","Type":"ContainerStarted","Data":"41e0e3806b4a0b233a9d8aacbac27f96d889391b91d13b1f5ba7161bf53ad1d3"} Mar 07 07:10:12 crc kubenswrapper[4941]: E0307 07:10:12.704526 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" podUID="19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478" Mar 07 07:10:12 crc 
kubenswrapper[4941]: E0307 07:10:12.704878 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" podUID="0dfad6cb-1bbf-4af8-bd06-efc92bfd4347" Mar 07 07:10:13 crc kubenswrapper[4941]: I0307 07:10:13.567162 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.567337 4941 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.567414 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert podName:466fcef1-3bd0-4fff-8e3b-c5dbea9cad30 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:17.567382375 +0000 UTC m=+1114.519747840 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert") pod "infra-operator-controller-manager-f7fcc58b9-5lfxq" (UID: "466fcef1-3bd0-4fff-8e3b-c5dbea9cad30") : secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.712174 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" podUID="5dbee8a7-f5e1-44df-ae39-850574975086" Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.712547 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" podUID="c0b92dab-fef5-4bf2-b07d-f3787dc8060c" Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.712610 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" podUID="0dfad6cb-1bbf-4af8-bd06-efc92bfd4347" Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.712720 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" podUID="42861835-d760-4e61-b9c2-cd0f3e3478d8" Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.714102 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" podUID="87e61107-4868-497d-a7fa-73f56f084ff2" Mar 07 07:10:13 crc kubenswrapper[4941]: I0307 07:10:13.974770 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.975005 4941 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:13 crc kubenswrapper[4941]: E0307 07:10:13.975052 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert podName:63bae88f-5e2e-4e53-9e5e-e7d31ca511d1 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:17.975038852 +0000 UTC m=+1114.927404307 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" (UID: "63bae88f-5e2e-4e53-9e5e-e7d31ca511d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:14 crc kubenswrapper[4941]: I0307 07:10:14.179052 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:14 crc kubenswrapper[4941]: I0307 07:10:14.179120 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:14 crc kubenswrapper[4941]: E0307 07:10:14.179294 4941 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 07:10:14 crc kubenswrapper[4941]: E0307 07:10:14.179333 4941 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 07:10:14 crc kubenswrapper[4941]: E0307 07:10:14.179349 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:18.179332061 +0000 UTC m=+1115.131697526 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "webhook-server-cert" not found Mar 07 07:10:14 crc kubenswrapper[4941]: E0307 07:10:14.179454 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:18.179435194 +0000 UTC m=+1115.131800659 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "metrics-server-cert" not found Mar 07 07:10:17 crc kubenswrapper[4941]: I0307 07:10:17.633145 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:17 crc kubenswrapper[4941]: E0307 07:10:17.633329 4941 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:17 crc kubenswrapper[4941]: E0307 07:10:17.633813 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert podName:466fcef1-3bd0-4fff-8e3b-c5dbea9cad30 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:25.633793654 +0000 UTC m=+1122.586159119 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert") pod "infra-operator-controller-manager-f7fcc58b9-5lfxq" (UID: "466fcef1-3bd0-4fff-8e3b-c5dbea9cad30") : secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:18 crc kubenswrapper[4941]: I0307 07:10:18.047429 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:18 crc kubenswrapper[4941]: E0307 07:10:18.048146 4941 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:18 crc kubenswrapper[4941]: E0307 07:10:18.048198 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert podName:63bae88f-5e2e-4e53-9e5e-e7d31ca511d1 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:26.048182915 +0000 UTC m=+1123.000548380 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" (UID: "63bae88f-5e2e-4e53-9e5e-e7d31ca511d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:18 crc kubenswrapper[4941]: I0307 07:10:18.251460 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:18 crc kubenswrapper[4941]: I0307 07:10:18.251687 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:18 crc kubenswrapper[4941]: E0307 07:10:18.251732 4941 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 07:10:18 crc kubenswrapper[4941]: E0307 07:10:18.251832 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:26.251806868 +0000 UTC m=+1123.204172503 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "webhook-server-cert" not found Mar 07 07:10:18 crc kubenswrapper[4941]: E0307 07:10:18.251883 4941 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 07:10:18 crc kubenswrapper[4941]: E0307 07:10:18.251972 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:26.251942332 +0000 UTC m=+1123.204307977 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "metrics-server-cert" not found Mar 07 07:10:25 crc kubenswrapper[4941]: I0307 07:10:25.670108 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:25 crc kubenswrapper[4941]: E0307 07:10:25.670466 4941 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:25 crc kubenswrapper[4941]: E0307 07:10:25.670973 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert 
podName:466fcef1-3bd0-4fff-8e3b-c5dbea9cad30 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:41.670926485 +0000 UTC m=+1138.623292000 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert") pod "infra-operator-controller-manager-f7fcc58b9-5lfxq" (UID: "466fcef1-3bd0-4fff-8e3b-c5dbea9cad30") : secret "infra-operator-webhook-server-cert" not found Mar 07 07:10:26 crc kubenswrapper[4941]: I0307 07:10:26.077963 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.078259 4941 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.079449 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert podName:63bae88f-5e2e-4e53-9e5e-e7d31ca511d1 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:42.079400291 +0000 UTC m=+1139.031765766 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" (UID: "63bae88f-5e2e-4e53-9e5e-e7d31ca511d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.187498 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051" Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.187729 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-48qrs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-64db6967f8-8qzr4_openstack-operators(a698d941-ce95-43c6-9512-1259d85a4cce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.189071 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" podUID="a698d941-ce95-43c6-9512-1259d85a4cce" Mar 07 07:10:26 crc kubenswrapper[4941]: I0307 07:10:26.299367 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod 
\"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:26 crc kubenswrapper[4941]: I0307 07:10:26.299485 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.299634 4941 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.299667 4941 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.299740 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. No retries permitted until 2026-03-07 07:10:42.299717131 +0000 UTC m=+1139.252082606 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "metrics-server-cert" not found Mar 07 07:10:26 crc kubenswrapper[4941]: E0307 07:10:26.299763 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs podName:90ee94e8-276b-476f-a6b6-4729bbd5fab3 nodeName:}" failed. 
No retries permitted until 2026-03-07 07:10:42.299754602 +0000 UTC m=+1139.252120077 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-9pblb" (UID: "90ee94e8-276b-476f-a6b6-4729bbd5fab3") : secret "webhook-server-cert" not found Mar 07 07:10:27 crc kubenswrapper[4941]: E0307 07:10:27.062926 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051\\\"\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" podUID="a698d941-ce95-43c6-9512-1259d85a4cce" Mar 07 07:10:27 crc kubenswrapper[4941]: E0307 07:10:27.779737 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120" Mar 07 07:10:27 crc kubenswrapper[4941]: E0307 07:10:27.779942 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wl5nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-6db6876945-t468c_openstack-operators(3df07af4-0aa2-4795-a129-22be2b991b9d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 07:10:27 crc kubenswrapper[4941]: E0307 07:10:27.781218 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" podUID="3df07af4-0aa2-4795-a129-22be2b991b9d" Mar 07 07:10:27 crc kubenswrapper[4941]: E0307 07:10:27.819031 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" podUID="3df07af4-0aa2-4795-a129-22be2b991b9d" Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.841792 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" event={"ID":"47672605-5408-4ff2-8b41-557efdcafbaf","Type":"ContainerStarted","Data":"fb0162214291f055f6a45a473a103ae36b80381cdd5476cfc35f61b0a9b0af81"} Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.842877 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.870788 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" event={"ID":"56d9967c-b8ab-43e8-be9d-0593d1e3f320","Type":"ContainerStarted","Data":"36330d08a9b767d49545da1b45b2ee184aca3563b731bccb4bb19e462375e68e"} Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.871585 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.886376 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" podStartSLOduration=3.168897547 podStartE2EDuration="19.886348473s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.101507871 +0000 UTC m=+1108.053873336" lastFinishedPulling="2026-03-07 07:10:27.818958787 +0000 UTC m=+1124.771324262" observedRunningTime="2026-03-07 07:10:28.871800029 +0000 UTC m=+1125.824165494" watchObservedRunningTime="2026-03-07 07:10:28.886348473 +0000 UTC m=+1125.838713938" Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.891355 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" event={"ID":"e803a3db-78f9-4d84-96a8-ffff5f62fe09","Type":"ContainerStarted","Data":"ff4f7df52e5c3162105349fadacfcea25e14c484e5396c7ee310a4b7c62b5b59"} Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.891905 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.897668 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" podStartSLOduration=3.575169189 podStartE2EDuration="19.897645287s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.471562403 +0000 UTC m=+1108.423927868" lastFinishedPulling="2026-03-07 07:10:27.794038501 +0000 UTC m=+1124.746403966" observedRunningTime="2026-03-07 07:10:28.894279296 +0000 UTC m=+1125.846644761" watchObservedRunningTime="2026-03-07 07:10:28.897645287 +0000 UTC m=+1125.850010762" Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.908049 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" 
event={"ID":"b292f34f-0728-4c26-8122-3ac065824456","Type":"ContainerStarted","Data":"23118fcd82725e0db09b6a7491a09b40eb5041fdbba81aee6bf834ad64884e22"} Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.908864 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.941118 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" event={"ID":"2158c14b-9b89-48d1-b76f-9b98bbfc6972","Type":"ContainerStarted","Data":"eeb9c018ee80c7fad7a4dcf156d78fd0eed90de151159314fdf6d2e4666846c0"} Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.945454 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" Mar 07 07:10:28 crc kubenswrapper[4941]: I0307 07:10:28.963113 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" podStartSLOduration=3.085103518 podStartE2EDuration="19.96309251s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:10.941244112 +0000 UTC m=+1107.893609577" lastFinishedPulling="2026-03-07 07:10:27.819233104 +0000 UTC m=+1124.771598569" observedRunningTime="2026-03-07 07:10:28.92486566 +0000 UTC m=+1125.877231125" watchObservedRunningTime="2026-03-07 07:10:28.96309251 +0000 UTC m=+1125.915457975" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.021887 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" podStartSLOduration=3.494970918 podStartE2EDuration="20.021870859s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.304184631 +0000 UTC m=+1108.256550096" 
lastFinishedPulling="2026-03-07 07:10:27.831084572 +0000 UTC m=+1124.783450037" observedRunningTime="2026-03-07 07:10:28.959476352 +0000 UTC m=+1125.911841817" watchObservedRunningTime="2026-03-07 07:10:29.021870859 +0000 UTC m=+1125.974236324" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.023778 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" event={"ID":"4194af88-c299-4713-a885-adb8cceedc13","Type":"ContainerStarted","Data":"19c7a6a34cf302d18a8eb7cd0973549419313f99720b9d9d91b025419e121933"} Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.023973 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.029963 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" event={"ID":"4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf","Type":"ContainerStarted","Data":"2d191e5b002efa0c905920b6d62cff0a20ee627c772480c3f68054a486d43249"} Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.030574 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.052287 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" event={"ID":"12f08e48-5775-4ba5-8321-e68eee8fd2c6","Type":"ContainerStarted","Data":"48a8a484b27aeb16fb3a3724a8c600a5c4e6b1c958860372abcfa848fc6f9dd1"} Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.054015 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.054684 4941 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" podStartSLOduration=3.286034526 podStartE2EDuration="20.054657677s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.103683324 +0000 UTC m=+1108.056048789" lastFinishedPulling="2026-03-07 07:10:27.872306475 +0000 UTC m=+1124.824671940" observedRunningTime="2026-03-07 07:10:29.034296282 +0000 UTC m=+1125.986661747" watchObservedRunningTime="2026-03-07 07:10:29.054657677 +0000 UTC m=+1126.007023142" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.058083 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" event={"ID":"16a86642-eb42-44bd-b668-8295e2316f09","Type":"ContainerStarted","Data":"5e909f60ef816b73208320ddfbe4ac7020a0fbbee79d7045cec8b61da204043a"} Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.058256 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.065588 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" event={"ID":"5dee621a-ccf7-486f-9865-fba380e4e1b1","Type":"ContainerStarted","Data":"365682f47bac9d948142ad3dfe3e32dcc1da0b53c9160515db56153a18e22b4a"} Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.065824 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.069828 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" podStartSLOduration=3.170667409 podStartE2EDuration="20.069802435s" 
podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:10.894935906 +0000 UTC m=+1107.847301371" lastFinishedPulling="2026-03-07 07:10:27.794070892 +0000 UTC m=+1124.746436397" observedRunningTime="2026-03-07 07:10:29.054889523 +0000 UTC m=+1126.007254988" watchObservedRunningTime="2026-03-07 07:10:29.069802435 +0000 UTC m=+1126.022167900" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.077755 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" event={"ID":"f2b19678-f6f3-41fb-8534-f0b826b523f2","Type":"ContainerStarted","Data":"b3517d6cade3dee69449fcc5fcc42ce1cc0706d14c60bdd446c04d32e7c6062d"} Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.078388 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.093616 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" podStartSLOduration=3.556586027 podStartE2EDuration="20.093590154s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.297099649 +0000 UTC m=+1108.249465114" lastFinishedPulling="2026-03-07 07:10:27.834103746 +0000 UTC m=+1124.786469241" observedRunningTime="2026-03-07 07:10:29.075925394 +0000 UTC m=+1126.028290859" watchObservedRunningTime="2026-03-07 07:10:29.093590154 +0000 UTC m=+1126.045955619" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.100752 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" event={"ID":"a1f803c1-f954-4aff-b54e-2baae04f1bbf","Type":"ContainerStarted","Data":"40059e0353285deba53dda98b9cfeef018a0eab2483cfcec1cc51d60d82222a6"} Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 
07:10:29.101926 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.121158 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" podStartSLOduration=3.5762904669999998 podStartE2EDuration="20.121133104s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.288792777 +0000 UTC m=+1108.241158242" lastFinishedPulling="2026-03-07 07:10:27.833635404 +0000 UTC m=+1124.786000879" observedRunningTime="2026-03-07 07:10:29.102267425 +0000 UTC m=+1126.054632890" watchObservedRunningTime="2026-03-07 07:10:29.121133104 +0000 UTC m=+1126.073498569" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.149625 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" podStartSLOduration=3.052137782 podStartE2EDuration="19.149601117s" podCreationTimestamp="2026-03-07 07:10:10 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.737056751 +0000 UTC m=+1108.689422216" lastFinishedPulling="2026-03-07 07:10:27.834520076 +0000 UTC m=+1124.786885551" observedRunningTime="2026-03-07 07:10:29.132644964 +0000 UTC m=+1126.085010429" watchObservedRunningTime="2026-03-07 07:10:29.149601117 +0000 UTC m=+1126.101966582" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.180210 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" podStartSLOduration=3.624777266 podStartE2EDuration="20.180186841s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.284203955 +0000 UTC m=+1108.236569420" lastFinishedPulling="2026-03-07 07:10:27.83961353 +0000 UTC m=+1124.791978995" 
observedRunningTime="2026-03-07 07:10:29.159059707 +0000 UTC m=+1126.111425172" watchObservedRunningTime="2026-03-07 07:10:29.180186841 +0000 UTC m=+1126.132552306" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.211865 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" podStartSLOduration=3.969442531 podStartE2EDuration="20.211839341s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.589539223 +0000 UTC m=+1108.541904688" lastFinishedPulling="2026-03-07 07:10:27.831936023 +0000 UTC m=+1124.784301498" observedRunningTime="2026-03-07 07:10:29.181044031 +0000 UTC m=+1126.133409506" watchObservedRunningTime="2026-03-07 07:10:29.211839341 +0000 UTC m=+1126.164204806" Mar 07 07:10:29 crc kubenswrapper[4941]: I0307 07:10:29.214082 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" podStartSLOduration=3.66685643 podStartE2EDuration="20.214074205s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.288793607 +0000 UTC m=+1108.241159072" lastFinishedPulling="2026-03-07 07:10:27.836011342 +0000 UTC m=+1124.788376847" observedRunningTime="2026-03-07 07:10:29.210296813 +0000 UTC m=+1126.162662278" watchObservedRunningTime="2026-03-07 07:10:29.214074205 +0000 UTC m=+1126.166439670" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.166290 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" event={"ID":"c0b92dab-fef5-4bf2-b07d-f3787dc8060c","Type":"ContainerStarted","Data":"80453347bfd1df7f3563e978e015c2d09d9fa5a1ccba1b04b8c8b7f94f37b8ba"} Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.167385 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.168194 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" event={"ID":"0dfad6cb-1bbf-4af8-bd06-efc92bfd4347","Type":"ContainerStarted","Data":"90c9a191e5262c9e51415e175661e21f674072dda975763ef2eea0306a0022e1"} Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.168390 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.169326 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" event={"ID":"87e61107-4868-497d-a7fa-73f56f084ff2","Type":"ContainerStarted","Data":"92d6a77148544799e9088aa9eae6d9c00d4923589cb200c8b90c764f6e8efca8"} Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.169456 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.170384 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" event={"ID":"42861835-d760-4e61-b9c2-cd0f3e3478d8","Type":"ContainerStarted","Data":"ed3792fed76117cbd638cde216dbbf4996f0883b74dcd808a3ecf9e73b612d95"} Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.172834 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" event={"ID":"5dbee8a7-f5e1-44df-ae39-850574975086","Type":"ContainerStarted","Data":"5679728597f24256e6f55db49d0111c31364d664ef24383cf7006493c5d265de"} Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.173005 4941 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.174422 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" event={"ID":"19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478","Type":"ContainerStarted","Data":"55dc689f38160dceb1fc812b5321bf4b70845cfd190a20c786207d35b4da762b"} Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.174618 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.195071 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" podStartSLOduration=3.681947696 podStartE2EDuration="27.195048513s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.603200335 +0000 UTC m=+1108.555565790" lastFinishedPulling="2026-03-07 07:10:35.116301142 +0000 UTC m=+1132.068666607" observedRunningTime="2026-03-07 07:10:36.189037067 +0000 UTC m=+1133.141402542" watchObservedRunningTime="2026-03-07 07:10:36.195048513 +0000 UTC m=+1133.147413978" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.211686 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" podStartSLOduration=3.651480677 podStartE2EDuration="27.211665638s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.608774241 +0000 UTC m=+1108.561139706" lastFinishedPulling="2026-03-07 07:10:35.168959202 +0000 UTC m=+1132.121324667" observedRunningTime="2026-03-07 07:10:36.207196489 +0000 UTC m=+1133.159561964" watchObservedRunningTime="2026-03-07 07:10:36.211665638 +0000 UTC m=+1133.164031113" Mar 07 07:10:36 crc 
kubenswrapper[4941]: I0307 07:10:36.227217 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" podStartSLOduration=3.720070254 podStartE2EDuration="27.227198505s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.603714408 +0000 UTC m=+1108.556079873" lastFinishedPulling="2026-03-07 07:10:35.110842659 +0000 UTC m=+1132.063208124" observedRunningTime="2026-03-07 07:10:36.222488651 +0000 UTC m=+1133.174854106" watchObservedRunningTime="2026-03-07 07:10:36.227198505 +0000 UTC m=+1133.179563960" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.243218 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" podStartSLOduration=3.7534129849999998 podStartE2EDuration="27.243201935s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.621054999 +0000 UTC m=+1108.573420454" lastFinishedPulling="2026-03-07 07:10:35.110843939 +0000 UTC m=+1132.063209404" observedRunningTime="2026-03-07 07:10:36.238054849 +0000 UTC m=+1133.190420324" watchObservedRunningTime="2026-03-07 07:10:36.243201935 +0000 UTC m=+1133.195567390" Mar 07 07:10:36 crc kubenswrapper[4941]: I0307 07:10:36.254836 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwtfx" podStartSLOduration=2.64221189 podStartE2EDuration="26.254824897s" podCreationTimestamp="2026-03-07 07:10:10 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.604266601 +0000 UTC m=+1108.556632066" lastFinishedPulling="2026-03-07 07:10:35.216879608 +0000 UTC m=+1132.169245073" observedRunningTime="2026-03-07 07:10:36.249319233 +0000 UTC m=+1133.201684698" watchObservedRunningTime="2026-03-07 07:10:36.254824897 +0000 UTC m=+1133.207190362" Mar 07 07:10:36 crc 
kubenswrapper[4941]: I0307 07:10:36.262919 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" podStartSLOduration=3.596418997 podStartE2EDuration="27.262908474s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:11.491858707 +0000 UTC m=+1108.444224172" lastFinishedPulling="2026-03-07 07:10:35.158348194 +0000 UTC m=+1132.110713649" observedRunningTime="2026-03-07 07:10:36.262378051 +0000 UTC m=+1133.214743536" watchObservedRunningTime="2026-03-07 07:10:36.262908474 +0000 UTC m=+1133.215273939" Mar 07 07:10:39 crc kubenswrapper[4941]: I0307 07:10:39.816665 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sbpft" Mar 07 07:10:39 crc kubenswrapper[4941]: I0307 07:10:39.876832 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qlb5v" Mar 07 07:10:39 crc kubenswrapper[4941]: I0307 07:10:39.944968 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-bvmsw" Mar 07 07:10:39 crc kubenswrapper[4941]: I0307 07:10:39.969580 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-7wr8g" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.204077 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" event={"ID":"3df07af4-0aa2-4795-a129-22be2b991b9d","Type":"ContainerStarted","Data":"c043447ad0d19f3ad5932816cd44d860152c79d5005ad1e3219b46efc9eb3c6d"} Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.204986 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.246777 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-kd7rp" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.264019 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-6r26z" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.269507 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" podStartSLOduration=2.7922211839999997 podStartE2EDuration="31.269487498s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:10.894118626 +0000 UTC m=+1107.846484091" lastFinishedPulling="2026-03-07 07:10:39.37138493 +0000 UTC m=+1136.323750405" observedRunningTime="2026-03-07 07:10:40.232399295 +0000 UTC m=+1137.184764760" watchObservedRunningTime="2026-03-07 07:10:40.269487498 +0000 UTC m=+1137.221852953" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.274628 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mv6rg" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.287255 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-bctkj" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.313792 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:10:40 crc 
kubenswrapper[4941]: I0307 07:10:40.313850 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.351971 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-d7n9x" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.478745 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-g86j9" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.480888 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-xjw4j" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.699197 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.740257 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-6srmj" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.765510 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-pbkqb" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.806038 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-hqpm5" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.823444 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-wtfh5" Mar 07 07:10:40 crc kubenswrapper[4941]: I0307 07:10:40.840511 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-8t2kp" Mar 07 07:10:41 crc kubenswrapper[4941]: I0307 07:10:41.213167 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" event={"ID":"a698d941-ce95-43c6-9512-1259d85a4cce","Type":"ContainerStarted","Data":"b537a52a6e54952608064f80073156b36a7039ea505342a1771d32aea4fcd0d4"} Mar 07 07:10:41 crc kubenswrapper[4941]: I0307 07:10:41.213514 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" Mar 07 07:10:41 crc kubenswrapper[4941]: I0307 07:10:41.227431 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" podStartSLOduration=2.733329891 podStartE2EDuration="32.22738776s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:10.941194441 +0000 UTC m=+1107.893559906" lastFinishedPulling="2026-03-07 07:10:40.43525231 +0000 UTC m=+1137.387617775" observedRunningTime="2026-03-07 07:10:41.227370669 +0000 UTC m=+1138.179736134" watchObservedRunningTime="2026-03-07 07:10:41.22738776 +0000 UTC m=+1138.179753245" Mar 07 07:10:41 crc kubenswrapper[4941]: I0307 07:10:41.744604 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:41 crc kubenswrapper[4941]: I0307 
07:10:41.752300 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/466fcef1-3bd0-4fff-8e3b-c5dbea9cad30-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-5lfxq\" (UID: \"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:41 crc kubenswrapper[4941]: I0307 07:10:41.926442 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.143288 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq"] Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.155362 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:42 crc kubenswrapper[4941]: W0307 07:10:42.156904 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod466fcef1_3bd0_4fff_8e3b_c5dbea9cad30.slice/crio-52da4b63edf8d179257173b89dfbddccd9181cb4c1a514c7155c6b5de330f3a9 WatchSource:0}: Error finding container 52da4b63edf8d179257173b89dfbddccd9181cb4c1a514c7155c6b5de330f3a9: Status 404 returned error can't find the container with id 52da4b63edf8d179257173b89dfbddccd9181cb4c1a514c7155c6b5de330f3a9 Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.166657 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63bae88f-5e2e-4e53-9e5e-e7d31ca511d1-cert\") pod 
\"openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn\" (UID: \"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.220559 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" event={"ID":"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30","Type":"ContainerStarted","Data":"52da4b63edf8d179257173b89dfbddccd9181cb4c1a514c7155c6b5de330f3a9"} Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.344006 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.360077 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.360146 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.364452 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: 
\"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.366492 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90ee94e8-276b-476f-a6b6-4729bbd5fab3-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-9pblb\" (UID: \"90ee94e8-276b-476f-a6b6-4729bbd5fab3\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.375158 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.602783 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn"] Mar 07 07:10:42 crc kubenswrapper[4941]: I0307 07:10:42.860203 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb"] Mar 07 07:10:42 crc kubenswrapper[4941]: W0307 07:10:42.869590 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90ee94e8_276b_476f_a6b6_4729bbd5fab3.slice/crio-b822fe18dab35a1b88bd4b471e577fa8d8989bacabea4ed7fe2705af8df15aac WatchSource:0}: Error finding container b822fe18dab35a1b88bd4b471e577fa8d8989bacabea4ed7fe2705af8df15aac: Status 404 returned error can't find the container with id b822fe18dab35a1b88bd4b471e577fa8d8989bacabea4ed7fe2705af8df15aac Mar 07 07:10:43 crc kubenswrapper[4941]: I0307 07:10:43.236329 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" 
event={"ID":"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1","Type":"ContainerStarted","Data":"8677c6bfee77b215482928738cc9baa65d157389f74e514198d5aeef92c6b9a9"} Mar 07 07:10:43 crc kubenswrapper[4941]: I0307 07:10:43.241353 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" event={"ID":"90ee94e8-276b-476f-a6b6-4729bbd5fab3","Type":"ContainerStarted","Data":"0977861ecd82a25ea69ff9ff8b6f3dafcf9edfb7dfb408a3f0e224e2090b1abc"} Mar 07 07:10:43 crc kubenswrapper[4941]: I0307 07:10:43.241378 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" event={"ID":"90ee94e8-276b-476f-a6b6-4729bbd5fab3","Type":"ContainerStarted","Data":"b822fe18dab35a1b88bd4b471e577fa8d8989bacabea4ed7fe2705af8df15aac"} Mar 07 07:10:43 crc kubenswrapper[4941]: I0307 07:10:43.242431 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:10:43 crc kubenswrapper[4941]: I0307 07:10:43.274015 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" podStartSLOduration=33.273994465 podStartE2EDuration="33.273994465s" podCreationTimestamp="2026-03-07 07:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:10:43.270239673 +0000 UTC m=+1140.222605138" watchObservedRunningTime="2026-03-07 07:10:43.273994465 +0000 UTC m=+1140.226359930" Mar 07 07:10:45 crc kubenswrapper[4941]: I0307 07:10:45.256864 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" 
event={"ID":"63bae88f-5e2e-4e53-9e5e-e7d31ca511d1","Type":"ContainerStarted","Data":"2f73b229a340501db50ad5d33624933da9a97d793e2fc844368cf733b5e92136"} Mar 07 07:10:45 crc kubenswrapper[4941]: I0307 07:10:45.257371 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:45 crc kubenswrapper[4941]: I0307 07:10:45.259077 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" event={"ID":"466fcef1-3bd0-4fff-8e3b-c5dbea9cad30","Type":"ContainerStarted","Data":"f6ea05c6ca5ffd9c1ce1df88ce7cc37dec96c23493361759e3972bf07a70552b"} Mar 07 07:10:45 crc kubenswrapper[4941]: I0307 07:10:45.292418 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" podStartSLOduration=34.100310329 podStartE2EDuration="36.292375274s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:42.612959374 +0000 UTC m=+1139.565324839" lastFinishedPulling="2026-03-07 07:10:44.805024319 +0000 UTC m=+1141.757389784" observedRunningTime="2026-03-07 07:10:45.286316557 +0000 UTC m=+1142.238682022" watchObservedRunningTime="2026-03-07 07:10:45.292375274 +0000 UTC m=+1142.244740739" Mar 07 07:10:45 crc kubenswrapper[4941]: I0307 07:10:45.313700 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" podStartSLOduration=33.678540189 podStartE2EDuration="36.313676702s" podCreationTimestamp="2026-03-07 07:10:09 +0000 UTC" firstStartedPulling="2026-03-07 07:10:42.161133833 +0000 UTC m=+1139.113499298" lastFinishedPulling="2026-03-07 07:10:44.796270346 +0000 UTC m=+1141.748635811" observedRunningTime="2026-03-07 07:10:45.308456855 +0000 UTC m=+1142.260822320" 
watchObservedRunningTime="2026-03-07 07:10:45.313676702 +0000 UTC m=+1142.266042167" Mar 07 07:10:46 crc kubenswrapper[4941]: I0307 07:10:46.267328 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:49 crc kubenswrapper[4941]: I0307 07:10:49.869871 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-t468c" Mar 07 07:10:49 crc kubenswrapper[4941]: I0307 07:10:49.925546 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-8qzr4" Mar 07 07:10:51 crc kubenswrapper[4941]: I0307 07:10:51.935526 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-5lfxq" Mar 07 07:10:52 crc kubenswrapper[4941]: I0307 07:10:52.355346 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn" Mar 07 07:10:52 crc kubenswrapper[4941]: I0307 07:10:52.385475 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-9pblb" Mar 07 07:11:02 crc kubenswrapper[4941]: I0307 07:11:02.804062 4941 scope.go:117] "RemoveContainer" containerID="6ed1b923f2ae62ed2ce73a78f3fb6f447a9d2f292591ac196920f850bba10c79" Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.969503 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-nvlxg"] Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.977273 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.979946 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.979964 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.980257 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.980397 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rl5ht" Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.982176 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk469\" (UniqueName: \"kubernetes.io/projected/39785f82-43bd-4676-89f3-048b89076a7b-kube-api-access-rk469\") pod \"dnsmasq-dns-589db6c89c-nvlxg\" (UID: \"39785f82-43bd-4676-89f3-048b89076a7b\") " pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.982588 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39785f82-43bd-4676-89f3-048b89076a7b-config\") pod \"dnsmasq-dns-589db6c89c-nvlxg\" (UID: \"39785f82-43bd-4676-89f3-048b89076a7b\") " pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:06 crc kubenswrapper[4941]: I0307 07:11:06.993779 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-nvlxg"] Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.029677 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-jfw56"] Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.036650 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.039135 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.042277 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-jfw56"] Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.084011 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39785f82-43bd-4676-89f3-048b89076a7b-config\") pod \"dnsmasq-dns-589db6c89c-nvlxg\" (UID: \"39785f82-43bd-4676-89f3-048b89076a7b\") " pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.084370 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk469\" (UniqueName: \"kubernetes.io/projected/39785f82-43bd-4676-89f3-048b89076a7b-kube-api-access-rk469\") pod \"dnsmasq-dns-589db6c89c-nvlxg\" (UID: \"39785f82-43bd-4676-89f3-048b89076a7b\") " pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.084983 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39785f82-43bd-4676-89f3-048b89076a7b-config\") pod \"dnsmasq-dns-589db6c89c-nvlxg\" (UID: \"39785f82-43bd-4676-89f3-048b89076a7b\") " pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.100964 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk469\" (UniqueName: \"kubernetes.io/projected/39785f82-43bd-4676-89f3-048b89076a7b-kube-api-access-rk469\") pod \"dnsmasq-dns-589db6c89c-nvlxg\" (UID: \"39785f82-43bd-4676-89f3-048b89076a7b\") " pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.185455 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.185717 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kd6q\" (UniqueName: \"kubernetes.io/projected/66c707fd-21a8-44db-89d7-96aafe64bdb8-kube-api-access-2kd6q\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.185837 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-config\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.287000 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.287094 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kd6q\" (UniqueName: \"kubernetes.io/projected/66c707fd-21a8-44db-89d7-96aafe64bdb8-kube-api-access-2kd6q\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.287130 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-config\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.288044 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-config\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.288622 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.306755 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.310312 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kd6q\" (UniqueName: \"kubernetes.io/projected/66c707fd-21a8-44db-89d7-96aafe64bdb8-kube-api-access-2kd6q\") pod \"dnsmasq-dns-86bbd886cf-jfw56\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.361310 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.724083 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-nvlxg"] Mar 07 07:11:07 crc kubenswrapper[4941]: W0307 07:11:07.735309 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39785f82_43bd_4676_89f3_048b89076a7b.slice/crio-bbe7bc003197c19994e255f886171ac28f8954bac7bad2ae4e6efab692a7715d WatchSource:0}: Error finding container bbe7bc003197c19994e255f886171ac28f8954bac7bad2ae4e6efab692a7715d: Status 404 returned error can't find the container with id bbe7bc003197c19994e255f886171ac28f8954bac7bad2ae4e6efab692a7715d Mar 07 07:11:07 crc kubenswrapper[4941]: I0307 07:11:07.941294 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-jfw56"] Mar 07 07:11:07 crc kubenswrapper[4941]: W0307 07:11:07.950146 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66c707fd_21a8_44db_89d7_96aafe64bdb8.slice/crio-ccdbae8821ffbfff01c0a0531c04c0a437f159533961a0fbac0798a822dbb77f WatchSource:0}: Error finding container ccdbae8821ffbfff01c0a0531c04c0a437f159533961a0fbac0798a822dbb77f: Status 404 returned error can't find the container with id ccdbae8821ffbfff01c0a0531c04c0a437f159533961a0fbac0798a822dbb77f Mar 07 07:11:08 crc kubenswrapper[4941]: I0307 07:11:08.444370 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" event={"ID":"66c707fd-21a8-44db-89d7-96aafe64bdb8","Type":"ContainerStarted","Data":"ccdbae8821ffbfff01c0a0531c04c0a437f159533961a0fbac0798a822dbb77f"} Mar 07 07:11:08 crc kubenswrapper[4941]: I0307 07:11:08.447717 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" 
event={"ID":"39785f82-43bd-4676-89f3-048b89076a7b","Type":"ContainerStarted","Data":"bbe7bc003197c19994e255f886171ac28f8954bac7bad2ae4e6efab692a7715d"} Mar 07 07:11:09 crc kubenswrapper[4941]: I0307 07:11:09.849220 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-nvlxg"] Mar 07 07:11:09 crc kubenswrapper[4941]: I0307 07:11:09.864808 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-q97z7"] Mar 07 07:11:09 crc kubenswrapper[4941]: I0307 07:11:09.867042 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:09 crc kubenswrapper[4941]: I0307 07:11:09.878103 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-q97z7"] Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.027486 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbt5r\" (UniqueName: \"kubernetes.io/projected/06b1fd5a-3d20-401e-969e-661d72270c2c-kube-api-access-mbt5r\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.027953 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-config\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.028013 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " 
pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.116771 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-jfw56"] Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.129149 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-config\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.129206 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.129304 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbt5r\" (UniqueName: \"kubernetes.io/projected/06b1fd5a-3d20-401e-969e-661d72270c2c-kube-api-access-mbt5r\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.131935 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-config\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.131978 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" 
(UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.136162 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-7bc8b"] Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.137187 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.171473 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbt5r\" (UniqueName: \"kubernetes.io/projected/06b1fd5a-3d20-401e-969e-661d72270c2c-kube-api-access-mbt5r\") pod \"dnsmasq-dns-78cb4465c9-q97z7\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.200505 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-7bc8b"] Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.207968 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.314265 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.314323 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.332217 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-config\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.332272 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.332322 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7m72\" (UniqueName: \"kubernetes.io/projected/b677cb4e-34de-4c2e-a9b9-507597162fa4-kube-api-access-s7m72\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " 
pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.436098 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7m72\" (UniqueName: \"kubernetes.io/projected/b677cb4e-34de-4c2e-a9b9-507597162fa4-kube-api-access-s7m72\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.436368 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-config\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.436390 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.437538 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.441256 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-config\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.454587 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7m72\" (UniqueName: \"kubernetes.io/projected/b677cb4e-34de-4c2e-a9b9-507597162fa4-kube-api-access-s7m72\") pod \"dnsmasq-dns-7c47bcb9f9-7bc8b\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.461950 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.530870 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-q97z7"] Mar 07 07:11:10 crc kubenswrapper[4941]: W0307 07:11:10.540924 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b1fd5a_3d20_401e_969e_661d72270c2c.slice/crio-0e34ff17a46821c72c80075884c386c253a7066bbb45740bdc10c55f32b4af15 WatchSource:0}: Error finding container 0e34ff17a46821c72c80075884c386c253a7066bbb45740bdc10c55f32b4af15: Status 404 returned error can't find the container with id 0e34ff17a46821c72c80075884c386c253a7066bbb45740bdc10c55f32b4af15 Mar 07 07:11:10 crc kubenswrapper[4941]: I0307 07:11:10.895305 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-7bc8b"] Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.027185 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.028484 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.038306 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.038426 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.040734 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.040919 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.041077 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.041294 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rnz26" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.041479 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.044285 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.146736 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147015 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147044 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvs7\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-kube-api-access-6nvs7\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147067 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147106 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147178 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147245 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147313 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147345 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147497 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.147542 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.250930 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251025 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251083 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251147 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251171 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvs7\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-kube-api-access-6nvs7\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251203 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251247 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251340 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251370 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251419 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.251458 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.252064 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.252432 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.252858 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.255016 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.256063 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.256289 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for 
volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.265128 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.273618 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.284576 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.289298 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.289362 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.305216 4941 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.308691 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nvs7\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-kube-api-access-6nvs7\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.310758 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.311017 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.311123 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.311264 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.311362 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.311699 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x642k" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.311872 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.314266 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.332747 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.363016 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.454762 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.454830 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.454859 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.454896 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.454915 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.454938 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.455004 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.455025 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdqw\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-kube-api-access-7pdqw\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.455049 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.455074 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.455096 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.473471 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" event={"ID":"06b1fd5a-3d20-401e-969e-661d72270c2c","Type":"ContainerStarted","Data":"0e34ff17a46821c72c80075884c386c253a7066bbb45740bdc10c55f32b4af15"} Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.556757 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.556820 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.556851 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " 
pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.556884 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.556914 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.556936 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.556967 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.556982 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.557002 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.557034 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.557058 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdqw\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-kube-api-access-7pdqw\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.557898 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.558621 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.558748 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-plugins\") pod \"rabbitmq-server-0\" 
(UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.559129 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.559323 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.559876 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.561359 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.561611 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.561659 4941 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.562748 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.576880 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdqw\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-kube-api-access-7pdqw\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.579682 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " pod="openstack/rabbitmq-server-0" Mar 07 07:11:11 crc kubenswrapper[4941]: I0307 07:11:11.703457 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.398020 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.399160 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.403384 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.403452 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-d4fqc" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.403699 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.404960 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.408663 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.413446 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.468111 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kolla-config\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.468187 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.468211 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.468257 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.468287 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjnw5\" (UniqueName: \"kubernetes.io/projected/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kube-api-access-jjnw5\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.468337 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.468419 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.468456 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-default\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.570542 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.570955 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-default\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.571038 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kolla-config\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.571132 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.571174 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.571302 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.571350 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjnw5\" (UniqueName: \"kubernetes.io/projected/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kube-api-access-jjnw5\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.571464 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.571779 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.573482 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 
07:11:12.579674 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.580628 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-default\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.594186 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.597143 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kolla-config\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.598290 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.600542 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") 
pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.603770 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjnw5\" (UniqueName: \"kubernetes.io/projected/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kube-api-access-jjnw5\") pod \"openstack-galera-0\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " pod="openstack/openstack-galera-0" Mar 07 07:11:12 crc kubenswrapper[4941]: I0307 07:11:12.731850 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.812928 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.818232 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.821198 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-jnmxl" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.821723 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.822133 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.824632 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.827524 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.897857 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.897926 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.897969 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.898005 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.898027 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxv7p\" (UniqueName: \"kubernetes.io/projected/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kube-api-access-rxv7p\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.898049 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.898066 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.898088 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.999171 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.999249 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.999293 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:13 crc kubenswrapper[4941]: I0307 07:11:13.999350 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.000185 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.000748 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.000823 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.001499 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.001556 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxv7p\" (UniqueName: \"kubernetes.io/projected/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kube-api-access-rxv7p\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.001585 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.001631 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.001979 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.003280 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.009310 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.018206 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.048026 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxv7p\" (UniqueName: \"kubernetes.io/projected/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kube-api-access-rxv7p\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.060749 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") " pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.071597 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.073453 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.076159 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.079242 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.081833 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-68hm5" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.084746 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.144864 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.204196 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.204281 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6wt\" (UniqueName: \"kubernetes.io/projected/5b22d449-d1ec-4bf4-a876-b86a87508580-kube-api-access-qt6wt\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.204350 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-kolla-config\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") 
" pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.204371 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.204389 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-config-data\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.305592 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6wt\" (UniqueName: \"kubernetes.io/projected/5b22d449-d1ec-4bf4-a876-b86a87508580-kube-api-access-qt6wt\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.305684 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-kolla-config\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.305721 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.305744 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-config-data\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.305800 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.306881 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-kolla-config\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.306899 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-config-data\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.309845 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.310487 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc 
kubenswrapper[4941]: I0307 07:11:14.321315 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6wt\" (UniqueName: \"kubernetes.io/projected/5b22d449-d1ec-4bf4-a876-b86a87508580-kube-api-access-qt6wt\") pod \"memcached-0\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.447168 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 07 07:11:14 crc kubenswrapper[4941]: I0307 07:11:14.516867 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" event={"ID":"b677cb4e-34de-4c2e-a9b9-507597162fa4","Type":"ContainerStarted","Data":"6d7313e1258e97c687796c2ac717292a880c1c691e67dde6766bef71701087a5"} Mar 07 07:11:16 crc kubenswrapper[4941]: I0307 07:11:16.174964 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:11:16 crc kubenswrapper[4941]: I0307 07:11:16.176440 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 07 07:11:16 crc kubenswrapper[4941]: I0307 07:11:16.180063 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gj7gt" Mar 07 07:11:16 crc kubenswrapper[4941]: I0307 07:11:16.191510 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:11:16 crc kubenswrapper[4941]: I0307 07:11:16.231694 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt88n\" (UniqueName: \"kubernetes.io/projected/7b8305a8-370d-4b70-8807-e0188603429f-kube-api-access-rt88n\") pod \"kube-state-metrics-0\" (UID: \"7b8305a8-370d-4b70-8807-e0188603429f\") " pod="openstack/kube-state-metrics-0" Mar 07 07:11:16 crc kubenswrapper[4941]: I0307 07:11:16.333075 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt88n\" (UniqueName: \"kubernetes.io/projected/7b8305a8-370d-4b70-8807-e0188603429f-kube-api-access-rt88n\") pod \"kube-state-metrics-0\" (UID: \"7b8305a8-370d-4b70-8807-e0188603429f\") " pod="openstack/kube-state-metrics-0" Mar 07 07:11:16 crc kubenswrapper[4941]: I0307 07:11:16.351836 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt88n\" (UniqueName: \"kubernetes.io/projected/7b8305a8-370d-4b70-8807-e0188603429f-kube-api-access-rt88n\") pod \"kube-state-metrics-0\" (UID: \"7b8305a8-370d-4b70-8807-e0188603429f\") " pod="openstack/kube-state-metrics-0" Mar 07 07:11:16 crc kubenswrapper[4941]: I0307 07:11:16.491961 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.299976 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.303203 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.308720 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.309196 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.309477 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-t7dxh" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.310692 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.311780 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.326254 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.352307 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x7fq9"] Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.353684 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.357237 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8ljwz" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.357551 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.357742 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.365112 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7fq9"] Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.403639 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmskj\" (UniqueName: \"kubernetes.io/projected/5f4f0d58-e159-427f-8cca-95525d4968cd-kube-api-access-qmskj\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.403702 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.403860 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-log-ovn\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.404169 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run-ovn\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.404334 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxgdr\" (UniqueName: \"kubernetes.io/projected/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-kube-api-access-hxgdr\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.404430 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-combined-ca-bundle\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.404746 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.404783 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.404810 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-ovn-controller-tls-certs\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.404882 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f4f0d58-e159-427f-8cca-95525d4968cd-scripts\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.404994 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.405335 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.406774 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.406854 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.406877 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.441248 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vrr7t"] Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.444517 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.455543 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vrr7t"] Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.508337 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509386 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-run\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509433 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509465 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509491 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-etc-ovs\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509511 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmskj\" (UniqueName: \"kubernetes.io/projected/5f4f0d58-e159-427f-8cca-95525d4968cd-kube-api-access-qmskj\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509536 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-log-ovn\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509559 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-log\") pod 
\"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509582 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxgdr\" (UniqueName: \"kubernetes.io/projected/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-kube-api-access-hxgdr\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509599 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-combined-ca-bundle\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509619 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8k4m\" (UniqueName: \"kubernetes.io/projected/531af2a1-d934-48a5-b3de-61d475bf252f-kube-api-access-m8k4m\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509652 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509670 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-ovn-controller-tls-certs\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " 
pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509687 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/531af2a1-d934-48a5-b3de-61d475bf252f-scripts\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509715 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509739 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509759 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509776 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-lib\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509820 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run-ovn\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509860 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.509878 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f4f0d58-e159-427f-8cca-95525d4968cd-scripts\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.510012 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.510339 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.510797 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-log-ovn\") pod \"ovn-controller-x7fq9\" (UID: 
\"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.511997 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f4f0d58-e159-427f-8cca-95525d4968cd-scripts\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.512597 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.512706 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run-ovn\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.513373 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-config\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.515496 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.516866 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.525058 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.525172 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.527020 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmskj\" (UniqueName: \"kubernetes.io/projected/5f4f0d58-e159-427f-8cca-95525d4968cd-kube-api-access-qmskj\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.529057 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxgdr\" (UniqueName: \"kubernetes.io/projected/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-kube-api-access-hxgdr\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.534083 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-combined-ca-bundle\") pod \"ovn-controller-x7fq9\" (UID: 
\"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.535797 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.537008 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-ovn-controller-tls-certs\") pod \"ovn-controller-x7fq9\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611059 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-run\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611125 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-etc-ovs\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611167 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-log\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611199 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m8k4m\" (UniqueName: \"kubernetes.io/projected/531af2a1-d934-48a5-b3de-61d475bf252f-kube-api-access-m8k4m\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611236 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/531af2a1-d934-48a5-b3de-61d475bf252f-scripts\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611274 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-lib\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611441 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-etc-ovs\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611200 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-run\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611640 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-lib\") pod 
\"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.611775 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-log\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.613313 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/531af2a1-d934-48a5-b3de-61d475bf252f-scripts\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.626681 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.629833 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8k4m\" (UniqueName: \"kubernetes.io/projected/531af2a1-d934-48a5-b3de-61d475bf252f-kube-api-access-m8k4m\") pod \"ovn-controller-ovs-vrr7t\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.670593 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:20 crc kubenswrapper[4941]: I0307 07:11:20.769543 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:22 crc kubenswrapper[4941]: E0307 07:11:22.300783 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 07 07:11:22 crc kubenswrapper[4941]: E0307 07:11:22.301272 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2kd6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,
ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-jfw56_openstack(66c707fd-21a8-44db-89d7-96aafe64bdb8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 07:11:22 crc kubenswrapper[4941]: E0307 07:11:22.302458 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" podUID="66c707fd-21a8-44db-89d7-96aafe64bdb8" Mar 07 07:11:22 crc kubenswrapper[4941]: E0307 07:11:22.357613 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 07 07:11:22 crc kubenswrapper[4941]: E0307 07:11:22.357749 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rk469,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-nvlxg_openstack(39785f82-43bd-4676-89f3-048b89076a7b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 07:11:22 crc kubenswrapper[4941]: E0307 07:11:22.359095 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" podUID="39785f82-43bd-4676-89f3-048b89076a7b" Mar 07 07:11:22 crc kubenswrapper[4941]: I0307 07:11:22.744792 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 07 07:11:22 crc kubenswrapper[4941]: I0307 07:11:22.970229 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 07:11:22 crc kubenswrapper[4941]: W0307 07:11:22.974238 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeb1dd04_5b8c_49b4_bf65_be38fb8ae670.slice/crio-f35c2527ef2f2e6926df80b3a68dfacaba9b01bed657fc4d575095c5fe323e92 WatchSource:0}: Error finding container f35c2527ef2f2e6926df80b3a68dfacaba9b01bed657fc4d575095c5fe323e92: Status 404 returned error can't find the container with id f35c2527ef2f2e6926df80b3a68dfacaba9b01bed657fc4d575095c5fe323e92 Mar 07 07:11:22 crc kubenswrapper[4941]: I0307 07:11:22.992731 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.045711 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.050236 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7fq9"] Mar 07 07:11:23 crc kubenswrapper[4941]: W0307 07:11:23.055074 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b8305a8_370d_4b70_8807_e0188603429f.slice/crio-c6a9c58d2981c0f4c12aff61b0a5c960da7136612c3a3ff3f65574c179cd7352 WatchSource:0}: Error finding container c6a9c58d2981c0f4c12aff61b0a5c960da7136612c3a3ff3f65574c179cd7352: Status 404 returned error can't find the container 
with id c6a9c58d2981c0f4c12aff61b0a5c960da7136612c3a3ff3f65574c179cd7352 Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.055985 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 07 07:11:23 crc kubenswrapper[4941]: W0307 07:11:23.063626 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f4f0d58_e159_427f_8cca_95525d4968cd.slice/crio-b5bcf5103d089cc5d3fe178719c8c36baaced33e41f54fc4d1efd0a62b22182b WatchSource:0}: Error finding container b5bcf5103d089cc5d3fe178719c8c36baaced33e41f54fc4d1efd0a62b22182b: Status 404 returned error can't find the container with id b5bcf5103d089cc5d3fe178719c8c36baaced33e41f54fc4d1efd0a62b22182b Mar 07 07:11:23 crc kubenswrapper[4941]: W0307 07:11:23.072200 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b22d449_d1ec_4bf4_a876_b86a87508580.slice/crio-cf6eeac298dd15c7159f49dece5dd7887b60604bebe300e1521ba43743b01426 WatchSource:0}: Error finding container cf6eeac298dd15c7159f49dece5dd7887b60604bebe300e1521ba43743b01426: Status 404 returned error can't find the container with id cf6eeac298dd15c7159f49dece5dd7887b60604bebe300e1521ba43743b01426 Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.076174 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.076395 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.080812 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.160180 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-config\") pod \"66c707fd-21a8-44db-89d7-96aafe64bdb8\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.160257 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kd6q\" (UniqueName: \"kubernetes.io/projected/66c707fd-21a8-44db-89d7-96aafe64bdb8-kube-api-access-2kd6q\") pod \"66c707fd-21a8-44db-89d7-96aafe64bdb8\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.160360 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-dns-svc\") pod \"66c707fd-21a8-44db-89d7-96aafe64bdb8\" (UID: \"66c707fd-21a8-44db-89d7-96aafe64bdb8\") " Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.160484 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39785f82-43bd-4676-89f3-048b89076a7b-config\") pod \"39785f82-43bd-4676-89f3-048b89076a7b\" (UID: \"39785f82-43bd-4676-89f3-048b89076a7b\") " Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.160549 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk469\" (UniqueName: \"kubernetes.io/projected/39785f82-43bd-4676-89f3-048b89076a7b-kube-api-access-rk469\") pod \"39785f82-43bd-4676-89f3-048b89076a7b\" (UID: \"39785f82-43bd-4676-89f3-048b89076a7b\") " Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.160852 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-config" (OuterVolumeSpecName: "config") pod "66c707fd-21a8-44db-89d7-96aafe64bdb8" (UID: "66c707fd-21a8-44db-89d7-96aafe64bdb8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.161303 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.161464 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66c707fd-21a8-44db-89d7-96aafe64bdb8" (UID: "66c707fd-21a8-44db-89d7-96aafe64bdb8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.161794 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39785f82-43bd-4676-89f3-048b89076a7b-config" (OuterVolumeSpecName: "config") pod "39785f82-43bd-4676-89f3-048b89076a7b" (UID: "39785f82-43bd-4676-89f3-048b89076a7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.165644 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c707fd-21a8-44db-89d7-96aafe64bdb8-kube-api-access-2kd6q" (OuterVolumeSpecName: "kube-api-access-2kd6q") pod "66c707fd-21a8-44db-89d7-96aafe64bdb8" (UID: "66c707fd-21a8-44db-89d7-96aafe64bdb8"). InnerVolumeSpecName "kube-api-access-2kd6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.166346 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39785f82-43bd-4676-89f3-048b89076a7b-kube-api-access-rk469" (OuterVolumeSpecName: "kube-api-access-rk469") pod "39785f82-43bd-4676-89f3-048b89076a7b" (UID: "39785f82-43bd-4676-89f3-048b89076a7b"). InnerVolumeSpecName "kube-api-access-rk469". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.262854 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c707fd-21a8-44db-89d7-96aafe64bdb8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.262893 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39785f82-43bd-4676-89f3-048b89076a7b-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.262917 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk469\" (UniqueName: \"kubernetes.io/projected/39785f82-43bd-4676-89f3-048b89076a7b-kube-api-access-rk469\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.262932 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kd6q\" (UniqueName: \"kubernetes.io/projected/66c707fd-21a8-44db-89d7-96aafe64bdb8-kube-api-access-2kd6q\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.281118 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 07 07:11:23 crc kubenswrapper[4941]: W0307 07:11:23.284224 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec9ebffe_8c04_481b_a187_bcdcca1a49a9.slice/crio-36f36c5e43282180ca4da99230a4eda143f25603a3890cbfb7db330759e9942b WatchSource:0}: Error finding container 36f36c5e43282180ca4da99230a4eda143f25603a3890cbfb7db330759e9942b: Status 404 returned error can't find the container with id 36f36c5e43282180ca4da99230a4eda143f25603a3890cbfb7db330759e9942b Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.604751 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5b22d449-d1ec-4bf4-a876-b86a87508580","Type":"ContainerStarted","Data":"cf6eeac298dd15c7159f49dece5dd7887b60604bebe300e1521ba43743b01426"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.608370 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ec9ebffe-8c04-481b-a187-bcdcca1a49a9","Type":"ContainerStarted","Data":"36f36c5e43282180ca4da99230a4eda143f25603a3890cbfb7db330759e9942b"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.610846 4941 generic.go:334] "Generic (PLEG): container finished" podID="b677cb4e-34de-4c2e-a9b9-507597162fa4" containerID="1ac2a08ce8397c02e977b41a27ecbafc7f3363e7afebad69f337aaa3258115e1" exitCode=0 Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.610927 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" event={"ID":"b677cb4e-34de-4c2e-a9b9-507597162fa4","Type":"ContainerDied","Data":"1ac2a08ce8397c02e977b41a27ecbafc7f3363e7afebad69f337aaa3258115e1"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.612768 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7b8305a8-370d-4b70-8807-e0188603429f","Type":"ContainerStarted","Data":"c6a9c58d2981c0f4c12aff61b0a5c960da7136612c3a3ff3f65574c179cd7352"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.616968 4941 generic.go:334] "Generic (PLEG): container 
finished" podID="06b1fd5a-3d20-401e-969e-661d72270c2c" containerID="9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554" exitCode=0 Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.617027 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" event={"ID":"06b1fd5a-3d20-401e-969e-661d72270c2c","Type":"ContainerDied","Data":"9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.645092 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670","Type":"ContainerStarted","Data":"f35c2527ef2f2e6926df80b3a68dfacaba9b01bed657fc4d575095c5fe323e92"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.649008 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1fb4667-396e-44bb-a2ed-e576a9b69be2","Type":"ContainerStarted","Data":"18d307fa7f9fdee1b297c14a672ca1ecc03942f9702524fffd3029aac8976ab3"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.653533 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" event={"ID":"39785f82-43bd-4676-89f3-048b89076a7b","Type":"ContainerDied","Data":"bbe7bc003197c19994e255f886171ac28f8954bac7bad2ae4e6efab692a7715d"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.653548 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-nvlxg" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.655284 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9" event={"ID":"5f4f0d58-e159-427f-8cca-95525d4968cd","Type":"ContainerStarted","Data":"b5bcf5103d089cc5d3fe178719c8c36baaced33e41f54fc4d1efd0a62b22182b"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.656915 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" event={"ID":"66c707fd-21a8-44db-89d7-96aafe64bdb8","Type":"ContainerDied","Data":"ccdbae8821ffbfff01c0a0531c04c0a437f159533961a0fbac0798a822dbb77f"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.656937 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-jfw56" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.658125 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3963d293-d9e9-44b6-b0a5-b1532b4a0a31","Type":"ContainerStarted","Data":"d103828190b73cd66b52e3c87baabe0626b7b15848c46ec255fb053470608b21"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.659247 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7b306e38-c479-45ff-93ab-ca0e0e6a3aef","Type":"ContainerStarted","Data":"fc62b7217c793fd339905be7b9764f09d0143de2cf0406cb41b1edbc4b3a0fb6"} Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.744873 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-nvlxg"] Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.755474 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-nvlxg"] Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.763002 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 07 07:11:23 crc 
kubenswrapper[4941]: I0307 07:11:23.764458 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.767841 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8bkhd" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.768579 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.768792 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.768952 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.806358 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-jfw56"] Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.816003 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-jfw56"] Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.835778 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.885197 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.885254 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-scripts\") pod 
\"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.885512 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.885603 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.885646 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.885672 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-config\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.885698 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzp4m\" (UniqueName: \"kubernetes.io/projected/32aba6f1-c08f-4826-8492-9f2979275f5e-kube-api-access-fzp4m\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" 
Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.885779 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.973412 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39785f82-43bd-4676-89f3-048b89076a7b" path="/var/lib/kubelet/pods/39785f82-43bd-4676-89f3-048b89076a7b/volumes" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.973998 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c707fd-21a8-44db-89d7-96aafe64bdb8" path="/var/lib/kubelet/pods/66c707fd-21a8-44db-89d7-96aafe64bdb8/volumes" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.989069 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.989166 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.989200 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc 
kubenswrapper[4941]: I0307 07:11:23.989264 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-config\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.989328 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzp4m\" (UniqueName: \"kubernetes.io/projected/32aba6f1-c08f-4826-8492-9f2979275f5e-kube-api-access-fzp4m\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.989445 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.989522 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.989563 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.989592 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.990146 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-config\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:23 crc kubenswrapper[4941]: I0307 07:11:23.991645 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:24 crc kubenswrapper[4941]: I0307 07:11:23.995495 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:24 crc kubenswrapper[4941]: I0307 07:11:23.997686 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:24 crc kubenswrapper[4941]: I0307 07:11:24.009631 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" 
Mar 07 07:11:24 crc kubenswrapper[4941]: I0307 07:11:24.012831 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:24 crc kubenswrapper[4941]: I0307 07:11:24.033806 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:24 crc kubenswrapper[4941]: I0307 07:11:24.034582 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzp4m\" (UniqueName: \"kubernetes.io/projected/32aba6f1-c08f-4826-8492-9f2979275f5e-kube-api-access-fzp4m\") pod \"ovsdbserver-sb-0\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:24 crc kubenswrapper[4941]: I0307 07:11:24.082316 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:24 crc kubenswrapper[4941]: I0307 07:11:24.351695 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vrr7t"] Mar 07 07:11:24 crc kubenswrapper[4941]: W0307 07:11:24.705015 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod531af2a1_d934_48a5_b3de_61d475bf252f.slice/crio-49c6452edd07d78c7f49664743d4192b47a0b8dc53d1b2b4f64e29bc0dbd0010 WatchSource:0}: Error finding container 49c6452edd07d78c7f49664743d4192b47a0b8dc53d1b2b4f64e29bc0dbd0010: Status 404 returned error can't find the container with id 49c6452edd07d78c7f49664743d4192b47a0b8dc53d1b2b4f64e29bc0dbd0010 Mar 07 07:11:25 crc kubenswrapper[4941]: I0307 07:11:25.700809 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" event={"ID":"b677cb4e-34de-4c2e-a9b9-507597162fa4","Type":"ContainerStarted","Data":"2968ed0672bf074989ea855c6a0d259bc1a63da30856401b4384899242b4ff0b"} Mar 07 07:11:25 crc kubenswrapper[4941]: I0307 07:11:25.700863 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:25 crc kubenswrapper[4941]: I0307 07:11:25.703916 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrr7t" event={"ID":"531af2a1-d934-48a5-b3de-61d475bf252f","Type":"ContainerStarted","Data":"49c6452edd07d78c7f49664743d4192b47a0b8dc53d1b2b4f64e29bc0dbd0010"} Mar 07 07:11:25 crc kubenswrapper[4941]: I0307 07:11:25.720702 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" podStartSLOduration=6.964400124 podStartE2EDuration="15.720680748s" podCreationTimestamp="2026-03-07 07:11:10 +0000 UTC" firstStartedPulling="2026-03-07 07:11:13.620959902 +0000 UTC m=+1170.573325367" lastFinishedPulling="2026-03-07 07:11:22.377240526 
+0000 UTC m=+1179.329605991" observedRunningTime="2026-03-07 07:11:25.717414258 +0000 UTC m=+1182.669779723" watchObservedRunningTime="2026-03-07 07:11:25.720680748 +0000 UTC m=+1182.673046213" Mar 07 07:11:30 crc kubenswrapper[4941]: I0307 07:11:30.464367 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:30 crc kubenswrapper[4941]: I0307 07:11:30.527111 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-q97z7"] Mar 07 07:11:31 crc kubenswrapper[4941]: I0307 07:11:31.752131 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" event={"ID":"06b1fd5a-3d20-401e-969e-661d72270c2c","Type":"ContainerStarted","Data":"1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4"} Mar 07 07:11:31 crc kubenswrapper[4941]: I0307 07:11:31.752500 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" podUID="06b1fd5a-3d20-401e-969e-661d72270c2c" containerName="dnsmasq-dns" containerID="cri-o://1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4" gracePeriod=10 Mar 07 07:11:31 crc kubenswrapper[4941]: I0307 07:11:31.752989 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:31 crc kubenswrapper[4941]: I0307 07:11:31.784855 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" podStartSLOduration=10.954253776 podStartE2EDuration="22.784826624s" podCreationTimestamp="2026-03-07 07:11:09 +0000 UTC" firstStartedPulling="2026-03-07 07:11:10.542301481 +0000 UTC m=+1167.494666946" lastFinishedPulling="2026-03-07 07:11:22.372874329 +0000 UTC m=+1179.325239794" observedRunningTime="2026-03-07 07:11:31.778181603 +0000 UTC m=+1188.730547088" watchObservedRunningTime="2026-03-07 07:11:31.784826624 +0000 UTC 
m=+1188.737192109" Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.239518 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 07 07:11:32 crc kubenswrapper[4941]: W0307 07:11:32.567216 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32aba6f1_c08f_4826_8492_9f2979275f5e.slice/crio-fcafe69f9d33cdcdd767b891dba420467db99d4b7347ab7c0732bdb8bd09d45a WatchSource:0}: Error finding container fcafe69f9d33cdcdd767b891dba420467db99d4b7347ab7c0732bdb8bd09d45a: Status 404 returned error can't find the container with id fcafe69f9d33cdcdd767b891dba420467db99d4b7347ab7c0732bdb8bd09d45a Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.722301 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.784333 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"32aba6f1-c08f-4826-8492-9f2979275f5e","Type":"ContainerStarted","Data":"fcafe69f9d33cdcdd767b891dba420467db99d4b7347ab7c0732bdb8bd09d45a"} Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.786880 4941 generic.go:334] "Generic (PLEG): container finished" podID="06b1fd5a-3d20-401e-969e-661d72270c2c" containerID="1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4" exitCode=0 Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.786917 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" event={"ID":"06b1fd5a-3d20-401e-969e-661d72270c2c","Type":"ContainerDied","Data":"1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4"} Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.786940 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" 
event={"ID":"06b1fd5a-3d20-401e-969e-661d72270c2c","Type":"ContainerDied","Data":"0e34ff17a46821c72c80075884c386c253a7066bbb45740bdc10c55f32b4af15"} Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.786966 4941 scope.go:117] "RemoveContainer" containerID="1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4" Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.787094 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-q97z7" Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.835621 4941 scope.go:117] "RemoveContainer" containerID="9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554" Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.887053 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-dns-svc\") pod \"06b1fd5a-3d20-401e-969e-661d72270c2c\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.887319 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbt5r\" (UniqueName: \"kubernetes.io/projected/06b1fd5a-3d20-401e-969e-661d72270c2c-kube-api-access-mbt5r\") pod \"06b1fd5a-3d20-401e-969e-661d72270c2c\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.887367 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-config\") pod \"06b1fd5a-3d20-401e-969e-661d72270c2c\" (UID: \"06b1fd5a-3d20-401e-969e-661d72270c2c\") " Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.981815 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b1fd5a-3d20-401e-969e-661d72270c2c-kube-api-access-mbt5r" (OuterVolumeSpecName: 
"kube-api-access-mbt5r") pod "06b1fd5a-3d20-401e-969e-661d72270c2c" (UID: "06b1fd5a-3d20-401e-969e-661d72270c2c"). InnerVolumeSpecName "kube-api-access-mbt5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:32 crc kubenswrapper[4941]: I0307 07:11:32.988695 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbt5r\" (UniqueName: \"kubernetes.io/projected/06b1fd5a-3d20-401e-969e-661d72270c2c-kube-api-access-mbt5r\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.146897 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06b1fd5a-3d20-401e-969e-661d72270c2c" (UID: "06b1fd5a-3d20-401e-969e-661d72270c2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.160222 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-config" (OuterVolumeSpecName: "config") pod "06b1fd5a-3d20-401e-969e-661d72270c2c" (UID: "06b1fd5a-3d20-401e-969e-661d72270c2c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.191596 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.191633 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06b1fd5a-3d20-401e-969e-661d72270c2c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.231035 4941 scope.go:117] "RemoveContainer" containerID="1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4" Mar 07 07:11:33 crc kubenswrapper[4941]: E0307 07:11:33.231499 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4\": container with ID starting with 1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4 not found: ID does not exist" containerID="1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.231555 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4"} err="failed to get container status \"1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4\": rpc error: code = NotFound desc = could not find container \"1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4\": container with ID starting with 1ef1904662270698027c0db239a5f5a81d24f4457a8ba6cdee65c6d06f31b1a4 not found: ID does not exist" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.231599 4941 scope.go:117] "RemoveContainer" containerID="9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554" Mar 07 07:11:33 crc 
kubenswrapper[4941]: E0307 07:11:33.232063 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554\": container with ID starting with 9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554 not found: ID does not exist" containerID="9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.232108 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554"} err="failed to get container status \"9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554\": rpc error: code = NotFound desc = could not find container \"9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554\": container with ID starting with 9699935b314b0362d96896407e0a378761d192d98575668571aa9ae294bb3554 not found: ID does not exist" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.420941 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-q97z7"] Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.426535 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-q97z7"] Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.807770 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7b306e38-c479-45ff-93ab-ca0e0e6a3aef","Type":"ContainerStarted","Data":"dc55c3da137f47538c0bd3c217f221d45ba4d3f6188abc6966831c84aa69682d"} Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.810788 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5b22d449-d1ec-4bf4-a876-b86a87508580","Type":"ContainerStarted","Data":"db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6"} Mar 07 07:11:33 crc 
kubenswrapper[4941]: I0307 07:11:33.810894 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.814348 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1fb4667-396e-44bb-a2ed-e576a9b69be2","Type":"ContainerStarted","Data":"0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3"} Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.816183 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9" event={"ID":"5f4f0d58-e159-427f-8cca-95525d4968cd","Type":"ContainerStarted","Data":"4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533"} Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.816308 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-x7fq9" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.820149 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ec9ebffe-8c04-481b-a187-bcdcca1a49a9","Type":"ContainerStarted","Data":"ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad"} Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.914851 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-x7fq9" podStartSLOduration=5.027402772 podStartE2EDuration="13.914832328s" podCreationTimestamp="2026-03-07 07:11:20 +0000 UTC" firstStartedPulling="2026-03-07 07:11:23.066893122 +0000 UTC m=+1180.019258577" lastFinishedPulling="2026-03-07 07:11:31.954322668 +0000 UTC m=+1188.906688133" observedRunningTime="2026-03-07 07:11:33.913895996 +0000 UTC m=+1190.866261481" watchObservedRunningTime="2026-03-07 07:11:33.914832328 +0000 UTC m=+1190.867197793" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.945282 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/memcached-0" podStartSLOduration=11.418170288 podStartE2EDuration="19.945258488s" podCreationTimestamp="2026-03-07 07:11:14 +0000 UTC" firstStartedPulling="2026-03-07 07:11:23.07461194 +0000 UTC m=+1180.026977395" lastFinishedPulling="2026-03-07 07:11:31.60170013 +0000 UTC m=+1188.554065595" observedRunningTime="2026-03-07 07:11:33.930576341 +0000 UTC m=+1190.882941806" watchObservedRunningTime="2026-03-07 07:11:33.945258488 +0000 UTC m=+1190.897623953" Mar 07 07:11:33 crc kubenswrapper[4941]: I0307 07:11:33.971086 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b1fd5a-3d20-401e-969e-661d72270c2c" path="/var/lib/kubelet/pods/06b1fd5a-3d20-401e-969e-661d72270c2c/volumes" Mar 07 07:11:34 crc kubenswrapper[4941]: I0307 07:11:34.835308 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrr7t" event={"ID":"531af2a1-d934-48a5-b3de-61d475bf252f","Type":"ContainerDied","Data":"d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912"} Mar 07 07:11:34 crc kubenswrapper[4941]: I0307 07:11:34.835334 4941 generic.go:334] "Generic (PLEG): container finished" podID="531af2a1-d934-48a5-b3de-61d475bf252f" containerID="d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912" exitCode=0 Mar 07 07:11:34 crc kubenswrapper[4941]: I0307 07:11:34.840397 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7b8305a8-370d-4b70-8807-e0188603429f","Type":"ContainerStarted","Data":"d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367"} Mar 07 07:11:34 crc kubenswrapper[4941]: I0307 07:11:34.840490 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 07 07:11:34 crc kubenswrapper[4941]: I0307 07:11:34.842243 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"3963d293-d9e9-44b6-b0a5-b1532b4a0a31","Type":"ContainerStarted","Data":"c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86"} Mar 07 07:11:34 crc kubenswrapper[4941]: I0307 07:11:34.844792 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670","Type":"ContainerStarted","Data":"a12662163e378ef0047a7d0c3ffc76b2214269655c2741ec69ff5a13c078ddf4"} Mar 07 07:11:34 crc kubenswrapper[4941]: I0307 07:11:34.850407 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"32aba6f1-c08f-4826-8492-9f2979275f5e","Type":"ContainerStarted","Data":"be456d1e5f8553ef85166e01f835ac78767170cdd6c5ad60cdfc7756602760a6"} Mar 07 07:11:34 crc kubenswrapper[4941]: I0307 07:11:34.871809 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.23852338 podStartE2EDuration="18.871789977s" podCreationTimestamp="2026-03-07 07:11:16 +0000 UTC" firstStartedPulling="2026-03-07 07:11:23.062473025 +0000 UTC m=+1180.014838490" lastFinishedPulling="2026-03-07 07:11:32.695739632 +0000 UTC m=+1189.648105087" observedRunningTime="2026-03-07 07:11:34.870390023 +0000 UTC m=+1191.822755498" watchObservedRunningTime="2026-03-07 07:11:34.871789977 +0000 UTC m=+1191.824155432" Mar 07 07:11:39 crc kubenswrapper[4941]: I0307 07:11:39.448848 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.313481 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.313826 4941 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.313865 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.314772 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b286093b0ba04c8db409f2f8003244d432459a7a31a64eb7ee6e534880ca523"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.314883 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://8b286093b0ba04c8db409f2f8003244d432459a7a31a64eb7ee6e534880ca523" gracePeriod=600 Mar 07 07:11:40 crc kubenswrapper[4941]: E0307 07:11:40.475471 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250d2c0d_993b_466a_a5e0_bacae5fe8df5.slice/crio-8b286093b0ba04c8db409f2f8003244d432459a7a31a64eb7ee6e534880ca523.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250d2c0d_993b_466a_a5e0_bacae5fe8df5.slice/crio-conmon-8b286093b0ba04c8db409f2f8003244d432459a7a31a64eb7ee6e534880ca523.scope\": RecentStats: unable to find data in memory cache]" Mar 07 
07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.895599 4941 generic.go:334] "Generic (PLEG): container finished" podID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" containerID="0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3" exitCode=0 Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.895990 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1fb4667-396e-44bb-a2ed-e576a9b69be2","Type":"ContainerDied","Data":"0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3"} Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.900743 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="8b286093b0ba04c8db409f2f8003244d432459a7a31a64eb7ee6e534880ca523" exitCode=0 Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.901054 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"8b286093b0ba04c8db409f2f8003244d432459a7a31a64eb7ee6e534880ca523"} Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.901121 4941 scope.go:117] "RemoveContainer" containerID="81c89fa64b6b91f6338e8315cd83a021b0214053cc3ad130bb16369071ad3bcf" Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.907766 4941 generic.go:334] "Generic (PLEG): container finished" podID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" containerID="dc55c3da137f47538c0bd3c217f221d45ba4d3f6188abc6966831c84aa69682d" exitCode=0 Mar 07 07:11:40 crc kubenswrapper[4941]: I0307 07:11:40.907807 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7b306e38-c479-45ff-93ab-ca0e0e6a3aef","Type":"ContainerDied","Data":"dc55c3da137f47538c0bd3c217f221d45ba4d3f6188abc6966831c84aa69682d"} Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.928005 4941 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"32aba6f1-c08f-4826-8492-9f2979275f5e","Type":"ContainerStarted","Data":"067c949e419323cc676040b4fd78ae141b059399cfde952987dfd044a128e4f0"} Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.932189 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrr7t" event={"ID":"531af2a1-d934-48a5-b3de-61d475bf252f","Type":"ContainerStarted","Data":"bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e"} Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.932225 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrr7t" event={"ID":"531af2a1-d934-48a5-b3de-61d475bf252f","Type":"ContainerStarted","Data":"cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd"} Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.932850 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.932887 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.942745 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7b306e38-c479-45ff-93ab-ca0e0e6a3aef","Type":"ContainerStarted","Data":"4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c"} Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.945799 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1fb4667-396e-44bb-a2ed-e576a9b69be2","Type":"ContainerStarted","Data":"7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3"} Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.950532 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"ec9ebffe-8c04-481b-a187-bcdcca1a49a9","Type":"ContainerStarted","Data":"0fdfa5c28298504762261e59ae8634154d32cea145ca796274ab84812c8beeb1"} Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.959572 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.667016969 podStartE2EDuration="19.959549653s" podCreationTimestamp="2026-03-07 07:11:22 +0000 UTC" firstStartedPulling="2026-03-07 07:11:32.569950062 +0000 UTC m=+1189.522315537" lastFinishedPulling="2026-03-07 07:11:40.862482756 +0000 UTC m=+1197.814848221" observedRunningTime="2026-03-07 07:11:41.95162388 +0000 UTC m=+1198.903989365" watchObservedRunningTime="2026-03-07 07:11:41.959549653 +0000 UTC m=+1198.911915148" Mar 07 07:11:41 crc kubenswrapper[4941]: I0307 07:11:41.974114 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"275b7664a9752e3935f384cd42fa92a626ebe7af03267d645869fc5e152276f5"} Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.010602 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vrr7t" podStartSLOduration=14.762910209 podStartE2EDuration="22.010586655s" podCreationTimestamp="2026-03-07 07:11:20 +0000 UTC" firstStartedPulling="2026-03-07 07:11:24.707378459 +0000 UTC m=+1181.659743924" lastFinishedPulling="2026-03-07 07:11:31.955054905 +0000 UTC m=+1188.907420370" observedRunningTime="2026-03-07 07:11:42.006821573 +0000 UTC m=+1198.959187048" watchObservedRunningTime="2026-03-07 07:11:42.010586655 +0000 UTC m=+1198.962952120" Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.014455 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.42334263 podStartE2EDuration="23.014443429s" podCreationTimestamp="2026-03-07 
07:11:19 +0000 UTC" firstStartedPulling="2026-03-07 07:11:23.286100375 +0000 UTC m=+1180.238465840" lastFinishedPulling="2026-03-07 07:11:40.877201184 +0000 UTC m=+1197.829566639" observedRunningTime="2026-03-07 07:11:41.988222461 +0000 UTC m=+1198.940587966" watchObservedRunningTime="2026-03-07 07:11:42.014443429 +0000 UTC m=+1198.966808894" Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.034171 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.081286951 podStartE2EDuration="31.034151368s" podCreationTimestamp="2026-03-07 07:11:11 +0000 UTC" firstStartedPulling="2026-03-07 07:11:23.001468301 +0000 UTC m=+1179.953833766" lastFinishedPulling="2026-03-07 07:11:31.954332718 +0000 UTC m=+1188.906698183" observedRunningTime="2026-03-07 07:11:42.023862528 +0000 UTC m=+1198.976227993" watchObservedRunningTime="2026-03-07 07:11:42.034151368 +0000 UTC m=+1198.986516833" Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.047221 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.539114123 podStartE2EDuration="30.047201755s" podCreationTimestamp="2026-03-07 07:11:12 +0000 UTC" firstStartedPulling="2026-03-07 07:11:23.06184505 +0000 UTC m=+1180.014210515" lastFinishedPulling="2026-03-07 07:11:32.569932682 +0000 UTC m=+1189.522298147" observedRunningTime="2026-03-07 07:11:42.043553227 +0000 UTC m=+1198.995918702" watchObservedRunningTime="2026-03-07 07:11:42.047201755 +0000 UTC m=+1198.999567220" Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.082909 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.140925 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.732669 4941 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.733054 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 07 07:11:42 crc kubenswrapper[4941]: I0307 07:11:42.971104 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.034036 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.320923 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-mgrsv"] Mar 07 07:11:43 crc kubenswrapper[4941]: E0307 07:11:43.322183 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b1fd5a-3d20-401e-969e-661d72270c2c" containerName="init" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.322212 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b1fd5a-3d20-401e-969e-661d72270c2c" containerName="init" Mar 07 07:11:43 crc kubenswrapper[4941]: E0307 07:11:43.322253 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b1fd5a-3d20-401e-969e-661d72270c2c" containerName="dnsmasq-dns" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.322265 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b1fd5a-3d20-401e-969e-661d72270c2c" containerName="dnsmasq-dns" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.322462 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b1fd5a-3d20-401e-969e-661d72270c2c" containerName="dnsmasq-dns" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.323488 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.326105 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.349125 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-mgrsv"] Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.363477 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-config\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.363523 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5h4\" (UniqueName: \"kubernetes.io/projected/03d5d76f-08c8-429f-b661-b51290a50767-kube-api-access-bx5h4\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.363610 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.363720 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " 
pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.420329 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8fkwt"] Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.421315 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.423324 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.432783 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8fkwt"] Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465246 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465287 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465311 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465345 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-combined-ca-bundle\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465421 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-config\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465506 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5h4\" (UniqueName: \"kubernetes.io/projected/03d5d76f-08c8-429f-b661-b51290a50767-kube-api-access-bx5h4\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465612 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88ccdd50-0997-4e6e-9e05-3555379221a0-config\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465659 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovs-rundir\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465679 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovn-rundir\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.465740 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dh27\" (UniqueName: \"kubernetes.io/projected/88ccdd50-0997-4e6e-9e05-3555379221a0-kube-api-access-2dh27\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.466176 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-config\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.466195 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.468265 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.486761 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-bx5h4\" (UniqueName: \"kubernetes.io/projected/03d5d76f-08c8-429f-b661-b51290a50767-kube-api-access-bx5h4\") pod \"dnsmasq-dns-795cf8b45c-mgrsv\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567114 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88ccdd50-0997-4e6e-9e05-3555379221a0-config\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567162 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovs-rundir\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567178 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovn-rundir\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567207 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dh27\" (UniqueName: \"kubernetes.io/projected/88ccdd50-0997-4e6e-9e05-3555379221a0-kube-api-access-2dh27\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567239 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567265 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-combined-ca-bundle\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567497 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovs-rundir\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567819 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovn-rundir\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.567960 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88ccdd50-0997-4e6e-9e05-3555379221a0-config\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.571030 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-combined-ca-bundle\") pod 
\"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.571043 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.584674 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dh27\" (UniqueName: \"kubernetes.io/projected/88ccdd50-0997-4e6e-9e05-3555379221a0-kube-api-access-2dh27\") pod \"ovn-controller-metrics-8fkwt\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.648787 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.723723 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-mgrsv"] Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.738070 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.750330 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-nbmdr"] Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.751530 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.754052 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.773993 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-nbmdr"] Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.873274 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm6rs\" (UniqueName: \"kubernetes.io/projected/8c76297a-be16-4716-9ae8-f7ed93c764b3-kube-api-access-rm6rs\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.873321 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-config\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.873360 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.873523 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " 
pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.873612 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.976067 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.976147 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-config\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.976168 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm6rs\" (UniqueName: \"kubernetes.io/projected/8c76297a-be16-4716-9ae8-f7ed93c764b3-kube-api-access-rm6rs\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.976201 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc 
kubenswrapper[4941]: I0307 07:11:43.976252 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.977109 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.977968 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.980741 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.982314 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-config\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.988214 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " 
pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:43 crc kubenswrapper[4941]: I0307 07:11:43.996616 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm6rs\" (UniqueName: \"kubernetes.io/projected/8c76297a-be16-4716-9ae8-f7ed93c764b3-kube-api-access-rm6rs\") pod \"dnsmasq-dns-7b57d9888c-nbmdr\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:44 crc kubenswrapper[4941]: I0307 07:11:44.086840 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:44 crc kubenswrapper[4941]: I0307 07:11:44.146310 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:44 crc kubenswrapper[4941]: I0307 07:11:44.146382 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:44 crc kubenswrapper[4941]: I0307 07:11:44.193040 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-mgrsv"] Mar 07 07:11:44 crc kubenswrapper[4941]: I0307 07:11:44.278166 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8fkwt"] Mar 07 07:11:44 crc kubenswrapper[4941]: W0307 07:11:44.300196 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ccdd50_0997_4e6e_9e05_3555379221a0.slice/crio-10439a9599cfed15257dfcbf32bc6d497a5c9f11ef5a7d790c825d44d56f77ba WatchSource:0}: Error finding container 10439a9599cfed15257dfcbf32bc6d497a5c9f11ef5a7d790c825d44d56f77ba: Status 404 returned error can't find the container with id 10439a9599cfed15257dfcbf32bc6d497a5c9f11ef5a7d790c825d44d56f77ba Mar 07 07:11:44 crc kubenswrapper[4941]: I0307 07:11:44.563649 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7b57d9888c-nbmdr"] Mar 07 07:11:44 crc kubenswrapper[4941]: W0307 07:11:44.571004 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c76297a_be16_4716_9ae8_f7ed93c764b3.slice/crio-483266bc3efea3d0ef6af03db6c956f56f8736876fccb2a276d17c02763f1052 WatchSource:0}: Error finding container 483266bc3efea3d0ef6af03db6c956f56f8736876fccb2a276d17c02763f1052: Status 404 returned error can't find the container with id 483266bc3efea3d0ef6af03db6c956f56f8736876fccb2a276d17c02763f1052 Mar 07 07:11:44 crc kubenswrapper[4941]: I0307 07:11:44.630678 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:44 crc kubenswrapper[4941]: I0307 07:11:44.674222 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.007964 4941 generic.go:334] "Generic (PLEG): container finished" podID="8c76297a-be16-4716-9ae8-f7ed93c764b3" containerID="d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df" exitCode=0 Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.008047 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" event={"ID":"8c76297a-be16-4716-9ae8-f7ed93c764b3","Type":"ContainerDied","Data":"d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df"} Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.008374 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" event={"ID":"8c76297a-be16-4716-9ae8-f7ed93c764b3","Type":"ContainerStarted","Data":"483266bc3efea3d0ef6af03db6c956f56f8736876fccb2a276d17c02763f1052"} Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.010943 4941 generic.go:334] "Generic (PLEG): container finished" podID="03d5d76f-08c8-429f-b661-b51290a50767" 
containerID="a2a9a98e2f9d5cac4aaecc1748185a6e6083f6637a61a2145743f59b49cba5a6" exitCode=0 Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.011097 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" event={"ID":"03d5d76f-08c8-429f-b661-b51290a50767","Type":"ContainerDied","Data":"a2a9a98e2f9d5cac4aaecc1748185a6e6083f6637a61a2145743f59b49cba5a6"} Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.012599 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" event={"ID":"03d5d76f-08c8-429f-b661-b51290a50767","Type":"ContainerStarted","Data":"47a8931de546003ecfe21e7e46990ca8e0b8ebbf386e03e0dea37a689c48b1db"} Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.026533 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8fkwt" event={"ID":"88ccdd50-0997-4e6e-9e05-3555379221a0","Type":"ContainerStarted","Data":"2dd2f8674fc7368ace30b5ccbaa4c590e6795b851ad19939a0b2b02851b97fd8"} Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.027056 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.027101 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8fkwt" event={"ID":"88ccdd50-0997-4e6e-9e05-3555379221a0","Type":"ContainerStarted","Data":"10439a9599cfed15257dfcbf32bc6d497a5c9f11ef5a7d790c825d44d56f77ba"} Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.076438 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8fkwt" podStartSLOduration=2.076392383 podStartE2EDuration="2.076392383s" podCreationTimestamp="2026-03-07 07:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:11:45.06638452 +0000 UTC m=+1202.018749985" 
watchObservedRunningTime="2026-03-07 07:11:45.076392383 +0000 UTC m=+1202.028757868" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.098016 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.277530 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.279750 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.281731 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.281746 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jx6g6" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.282029 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.282166 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.300254 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pkpg\" (UniqueName: \"kubernetes.io/projected/d1ad12db-0b25-4e03-8772-de047be41b0d-kube-api-access-4pkpg\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.300331 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-config\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 
07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.300365 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-scripts\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.300389 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.300447 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.300482 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.300507 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.308616 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-northd-0"] Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.342854 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.401196 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-dns-svc\") pod \"03d5d76f-08c8-429f-b661-b51290a50767\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.401275 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx5h4\" (UniqueName: \"kubernetes.io/projected/03d5d76f-08c8-429f-b661-b51290a50767-kube-api-access-bx5h4\") pod \"03d5d76f-08c8-429f-b661-b51290a50767\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.401451 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-ovsdbserver-sb\") pod \"03d5d76f-08c8-429f-b661-b51290a50767\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.401520 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-config\") pod \"03d5d76f-08c8-429f-b661-b51290a50767\" (UID: \"03d5d76f-08c8-429f-b661-b51290a50767\") " Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.401856 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 
07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.401940 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pkpg\" (UniqueName: \"kubernetes.io/projected/d1ad12db-0b25-4e03-8772-de047be41b0d-kube-api-access-4pkpg\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.402018 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-config\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.402053 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-scripts\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.402080 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.402105 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.402139 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.402735 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.404160 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-scripts\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.408395 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.409533 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d5d76f-08c8-429f-b661-b51290a50767-kube-api-access-bx5h4" (OuterVolumeSpecName: "kube-api-access-bx5h4") pod "03d5d76f-08c8-429f-b661-b51290a50767" (UID: "03d5d76f-08c8-429f-b661-b51290a50767"). InnerVolumeSpecName "kube-api-access-bx5h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.415844 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.431514 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-config\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.433461 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.434830 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-config" (OuterVolumeSpecName: "config") pod "03d5d76f-08c8-429f-b661-b51290a50767" (UID: "03d5d76f-08c8-429f-b661-b51290a50767"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.435863 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03d5d76f-08c8-429f-b661-b51290a50767" (UID: "03d5d76f-08c8-429f-b661-b51290a50767"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.437805 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "03d5d76f-08c8-429f-b661-b51290a50767" (UID: "03d5d76f-08c8-429f-b661-b51290a50767"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.448255 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pkpg\" (UniqueName: \"kubernetes.io/projected/d1ad12db-0b25-4e03-8772-de047be41b0d-kube-api-access-4pkpg\") pod \"ovn-northd-0\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " pod="openstack/ovn-northd-0" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.503324 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.503354 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.503364 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx5h4\" (UniqueName: \"kubernetes.io/projected/03d5d76f-08c8-429f-b661-b51290a50767-kube-api-access-bx5h4\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.503375 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03d5d76f-08c8-429f-b661-b51290a50767-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:45 crc kubenswrapper[4941]: I0307 07:11:45.606651 4941 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.031951 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.035019 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" event={"ID":"03d5d76f-08c8-429f-b661-b51290a50767","Type":"ContainerDied","Data":"47a8931de546003ecfe21e7e46990ca8e0b8ebbf386e03e0dea37a689c48b1db"} Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.035070 4941 scope.go:117] "RemoveContainer" containerID="a2a9a98e2f9d5cac4aaecc1748185a6e6083f6637a61a2145743f59b49cba5a6" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.035075 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-mgrsv" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.038550 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" event={"ID":"8c76297a-be16-4716-9ae8-f7ed93c764b3","Type":"ContainerStarted","Data":"3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c"} Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.039056 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.054929 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" podStartSLOduration=3.054909487 podStartE2EDuration="3.054909487s" podCreationTimestamp="2026-03-07 07:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:11:46.052235742 +0000 UTC m=+1203.004601217" watchObservedRunningTime="2026-03-07 07:11:46.054909487 +0000 UTC m=+1203.007274962" Mar 07 07:11:46 crc 
kubenswrapper[4941]: I0307 07:11:46.113982 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-mgrsv"] Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.120011 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-mgrsv"] Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.501088 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.559345 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-nbmdr"] Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.594802 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-htc9x"] Mar 07 07:11:46 crc kubenswrapper[4941]: E0307 07:11:46.595166 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d5d76f-08c8-429f-b661-b51290a50767" containerName="init" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.595181 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d5d76f-08c8-429f-b661-b51290a50767" containerName="init" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.595332 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d5d76f-08c8-429f-b661-b51290a50767" containerName="init" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.596116 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.603582 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-htc9x"] Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.724327 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.724648 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-config\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.724797 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.724925 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-dns-svc\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.725096 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q74qd\" (UniqueName: \"kubernetes.io/projected/1458c12c-70fd-4cc9-b886-88f99711104f-kube-api-access-q74qd\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.826908 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.827245 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-config\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.827378 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.827536 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-dns-svc\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.827669 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q74qd\" (UniqueName: 
\"kubernetes.io/projected/1458c12c-70fd-4cc9-b886-88f99711104f-kube-api-access-q74qd\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.827966 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.827967 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-config\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.828227 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.828568 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-dns-svc\") pod \"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.847085 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q74qd\" (UniqueName: \"kubernetes.io/projected/1458c12c-70fd-4cc9-b886-88f99711104f-kube-api-access-q74qd\") pod 
\"dnsmasq-dns-675f7dd995-htc9x\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:46 crc kubenswrapper[4941]: I0307 07:11:46.923336 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.059607 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1ad12db-0b25-4e03-8772-de047be41b0d","Type":"ContainerStarted","Data":"d6a720f08f8f966945eb7d616110127ed462090089cba4b66d921d17f0f339ee"} Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.242112 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.330771 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.390337 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-htc9x"] Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.691922 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.697226 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.698907 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.698911 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.699155 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.699864 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lcbqx" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.720943 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.840859 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-lock\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.840903 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.840946 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc 
kubenswrapper[4941]: I0307 07:11:47.840983 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-cache\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.841032 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a5f223a-7907-42a5-954b-fafc3c4b78da-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.841057 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4nrl\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-kube-api-access-z4nrl\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.942784 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-cache\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.943135 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a5f223a-7907-42a5-954b-fafc3c4b78da-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.943165 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z4nrl\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-kube-api-access-z4nrl\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.943192 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-lock\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.943214 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.943235 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-cache\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.943253 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: E0307 07:11:47.943361 4941 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:11:47 crc kubenswrapper[4941]: E0307 07:11:47.943374 4941 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 07 
07:11:47 crc kubenswrapper[4941]: E0307 07:11:47.943437 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift podName:6a5f223a-7907-42a5-954b-fafc3c4b78da nodeName:}" failed. No retries permitted until 2026-03-07 07:11:48.443419686 +0000 UTC m=+1205.395785151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift") pod "swift-storage-0" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da") : configmap "swift-ring-files" not found Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.943761 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.943856 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-lock\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.950885 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a5f223a-7907-42a5-954b-fafc3c4b78da-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.961359 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4nrl\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-kube-api-access-z4nrl\") pod \"swift-storage-0\" (UID: 
\"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.967063 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d5d76f-08c8-429f-b661-b51290a50767" path="/var/lib/kubelet/pods/03d5d76f-08c8-429f-b661-b51290a50767/volumes" Mar 07 07:11:47 crc kubenswrapper[4941]: I0307 07:11:47.969743 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.070989 4941 generic.go:334] "Generic (PLEG): container finished" podID="1458c12c-70fd-4cc9-b886-88f99711104f" containerID="9eb14d74e3a81aeea91e4a8d7aee8449a7e5e24a4dec781e91b80b616894f3ea" exitCode=0 Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.071057 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" event={"ID":"1458c12c-70fd-4cc9-b886-88f99711104f","Type":"ContainerDied","Data":"9eb14d74e3a81aeea91e4a8d7aee8449a7e5e24a4dec781e91b80b616894f3ea"} Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.071121 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" event={"ID":"1458c12c-70fd-4cc9-b886-88f99711104f","Type":"ContainerStarted","Data":"91ae9e6ac2d97c6d36a4671564bc8e0275de1f13f9005b91481b06bcd9d63cb2"} Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.074163 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1ad12db-0b25-4e03-8772-de047be41b0d","Type":"ContainerStarted","Data":"98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75"} Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.074229 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"d1ad12db-0b25-4e03-8772-de047be41b0d","Type":"ContainerStarted","Data":"f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e"} Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.075125 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" podUID="8c76297a-be16-4716-9ae8-f7ed93c764b3" containerName="dnsmasq-dns" containerID="cri-o://3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c" gracePeriod=10 Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.137246 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.5501383629999999 podStartE2EDuration="3.13722212s" podCreationTimestamp="2026-03-07 07:11:45 +0000 UTC" firstStartedPulling="2026-03-07 07:11:46.032539912 +0000 UTC m=+1202.984905427" lastFinishedPulling="2026-03-07 07:11:47.619623719 +0000 UTC m=+1204.571989184" observedRunningTime="2026-03-07 07:11:48.109841824 +0000 UTC m=+1205.062207289" watchObservedRunningTime="2026-03-07 07:11:48.13722212 +0000 UTC m=+1205.089587605" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.451616 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:48 crc kubenswrapper[4941]: E0307 07:11:48.451781 4941 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:11:48 crc kubenswrapper[4941]: E0307 07:11:48.452064 4941 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:11:48 crc kubenswrapper[4941]: E0307 07:11:48.452137 4941 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift podName:6a5f223a-7907-42a5-954b-fafc3c4b78da nodeName:}" failed. No retries permitted until 2026-03-07 07:11:49.45211475 +0000 UTC m=+1206.404480235 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift") pod "swift-storage-0" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da") : configmap "swift-ring-files" not found Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.461293 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.552853 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm6rs\" (UniqueName: \"kubernetes.io/projected/8c76297a-be16-4716-9ae8-f7ed93c764b3-kube-api-access-rm6rs\") pod \"8c76297a-be16-4716-9ae8-f7ed93c764b3\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.552970 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-sb\") pod \"8c76297a-be16-4716-9ae8-f7ed93c764b3\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.552999 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-nb\") pod \"8c76297a-be16-4716-9ae8-f7ed93c764b3\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.553104 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-config\") pod \"8c76297a-be16-4716-9ae8-f7ed93c764b3\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.553132 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-dns-svc\") pod \"8c76297a-be16-4716-9ae8-f7ed93c764b3\" (UID: \"8c76297a-be16-4716-9ae8-f7ed93c764b3\") " Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.558245 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c76297a-be16-4716-9ae8-f7ed93c764b3-kube-api-access-rm6rs" (OuterVolumeSpecName: "kube-api-access-rm6rs") pod "8c76297a-be16-4716-9ae8-f7ed93c764b3" (UID: "8c76297a-be16-4716-9ae8-f7ed93c764b3"). InnerVolumeSpecName "kube-api-access-rm6rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.597697 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c76297a-be16-4716-9ae8-f7ed93c764b3" (UID: "8c76297a-be16-4716-9ae8-f7ed93c764b3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.600027 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-config" (OuterVolumeSpecName: "config") pod "8c76297a-be16-4716-9ae8-f7ed93c764b3" (UID: "8c76297a-be16-4716-9ae8-f7ed93c764b3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.606948 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c76297a-be16-4716-9ae8-f7ed93c764b3" (UID: "8c76297a-be16-4716-9ae8-f7ed93c764b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.609672 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c76297a-be16-4716-9ae8-f7ed93c764b3" (UID: "8c76297a-be16-4716-9ae8-f7ed93c764b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.656221 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.656264 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.656304 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:48 crc kubenswrapper[4941]: I0307 07:11:48.656332 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm6rs\" (UniqueName: \"kubernetes.io/projected/8c76297a-be16-4716-9ae8-f7ed93c764b3-kube-api-access-rm6rs\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:48 crc 
kubenswrapper[4941]: I0307 07:11:48.656346 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c76297a-be16-4716-9ae8-f7ed93c764b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.086103 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" event={"ID":"1458c12c-70fd-4cc9-b886-88f99711104f","Type":"ContainerStarted","Data":"76c58b9a930b18c64f775408126e8cec528882fdb608a784350559dcc0d3e016"} Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.086434 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.088826 4941 generic.go:334] "Generic (PLEG): container finished" podID="8c76297a-be16-4716-9ae8-f7ed93c764b3" containerID="3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c" exitCode=0 Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.090459 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.093687 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" event={"ID":"8c76297a-be16-4716-9ae8-f7ed93c764b3","Type":"ContainerDied","Data":"3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c"} Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.104589 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-nbmdr" event={"ID":"8c76297a-be16-4716-9ae8-f7ed93c764b3","Type":"ContainerDied","Data":"483266bc3efea3d0ef6af03db6c956f56f8736876fccb2a276d17c02763f1052"} Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.104670 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.104700 4941 scope.go:117] "RemoveContainer" containerID="3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.137577 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" podStartSLOduration=3.137556695 podStartE2EDuration="3.137556695s" podCreationTimestamp="2026-03-07 07:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:11:49.122475588 +0000 UTC m=+1206.074841053" watchObservedRunningTime="2026-03-07 07:11:49.137556695 +0000 UTC m=+1206.089922160" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.143279 4941 scope.go:117] "RemoveContainer" containerID="d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.171084 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-nbmdr"] Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 
07:11:49.178651 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-nbmdr"] Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.186166 4941 scope.go:117] "RemoveContainer" containerID="3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c" Mar 07 07:11:49 crc kubenswrapper[4941]: E0307 07:11:49.189141 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c\": container with ID starting with 3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c not found: ID does not exist" containerID="3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.189183 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c"} err="failed to get container status \"3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c\": rpc error: code = NotFound desc = could not find container \"3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c\": container with ID starting with 3cfb69e8a754fb9a68ec72f0b58e34ab064337380e6bd8d2bf75f364c27a7f5c not found: ID does not exist" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.189211 4941 scope.go:117] "RemoveContainer" containerID="d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df" Mar 07 07:11:49 crc kubenswrapper[4941]: E0307 07:11:49.189915 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df\": container with ID starting with d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df not found: ID does not exist" containerID="d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df" 
Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.189946 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df"} err="failed to get container status \"d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df\": rpc error: code = NotFound desc = could not find container \"d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df\": container with ID starting with d1fced01693e014882b87bf747a1d1ae23149cdf0d0d96f5ef7caef64014d0df not found: ID does not exist" Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.473301 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:49 crc kubenswrapper[4941]: E0307 07:11:49.473504 4941 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:11:49 crc kubenswrapper[4941]: E0307 07:11:49.473522 4941 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:11:49 crc kubenswrapper[4941]: E0307 07:11:49.473581 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift podName:6a5f223a-7907-42a5-954b-fafc3c4b78da nodeName:}" failed. No retries permitted until 2026-03-07 07:11:51.473558828 +0000 UTC m=+1208.425924293 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift") pod "swift-storage-0" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da") : configmap "swift-ring-files" not found Mar 07 07:11:49 crc kubenswrapper[4941]: I0307 07:11:49.967616 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c76297a-be16-4716-9ae8-f7ed93c764b3" path="/var/lib/kubelet/pods/8c76297a-be16-4716-9ae8-f7ed93c764b3/volumes" Mar 07 07:11:50 crc kubenswrapper[4941]: I0307 07:11:50.840376 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 07 07:11:50 crc kubenswrapper[4941]: I0307 07:11:50.907911 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.478973 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-46v2p"] Mar 07 07:11:51 crc kubenswrapper[4941]: E0307 07:11:51.480339 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c76297a-be16-4716-9ae8-f7ed93c764b3" containerName="dnsmasq-dns" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.480372 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c76297a-be16-4716-9ae8-f7ed93c764b3" containerName="dnsmasq-dns" Mar 07 07:11:51 crc kubenswrapper[4941]: E0307 07:11:51.480453 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c76297a-be16-4716-9ae8-f7ed93c764b3" containerName="init" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.480467 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c76297a-be16-4716-9ae8-f7ed93c764b3" containerName="init" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.481034 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c76297a-be16-4716-9ae8-f7ed93c764b3" containerName="dnsmasq-dns" Mar 07 07:11:51 crc 
kubenswrapper[4941]: I0307 07:11:51.482236 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.490324 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.493253 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-46v2p"] Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.507088 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:51 crc kubenswrapper[4941]: E0307 07:11:51.507264 4941 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:11:51 crc kubenswrapper[4941]: E0307 07:11:51.507277 4941 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:11:51 crc kubenswrapper[4941]: E0307 07:11:51.507314 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift podName:6a5f223a-7907-42a5-954b-fafc3c4b78da nodeName:}" failed. No retries permitted until 2026-03-07 07:11:55.50730108 +0000 UTC m=+1212.459666545 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift") pod "swift-storage-0" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da") : configmap "swift-ring-files" not found Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.608646 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9gl\" (UniqueName: \"kubernetes.io/projected/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-kube-api-access-4x9gl\") pod \"root-account-create-update-46v2p\" (UID: \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\") " pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.608713 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-operator-scripts\") pod \"root-account-create-update-46v2p\" (UID: \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\") " pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.647592 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-ktvg9"] Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.648849 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.652417 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.652790 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.652810 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.659728 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ktvg9"] Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.693213 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m9v9g"] Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.694551 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.699863 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ktvg9"] Mar 07 07:11:51 crc kubenswrapper[4941]: E0307 07:11:51.700582 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-944bp ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-ktvg9" podUID="8986a49b-0d0d-4f00-b296-870f12fa237d" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709723 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-944bp\" (UniqueName: \"kubernetes.io/projected/8986a49b-0d0d-4f00-b296-870f12fa237d-kube-api-access-944bp\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709770 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-scripts\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709817 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-dispersionconf\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709845 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-swiftconf\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709861 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-combined-ca-bundle\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709877 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8986a49b-0d0d-4f00-b296-870f12fa237d-etc-swift\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709908 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9gl\" (UniqueName: \"kubernetes.io/projected/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-kube-api-access-4x9gl\") pod \"root-account-create-update-46v2p\" (UID: \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\") " pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709933 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-operator-scripts\") pod \"root-account-create-update-46v2p\" (UID: \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\") " pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.709954 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-ring-data-devices\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.711909 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-operator-scripts\") pod \"root-account-create-update-46v2p\" (UID: \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\") " pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.715516 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m9v9g"] Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.738092 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9gl\" (UniqueName: \"kubernetes.io/projected/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-kube-api-access-4x9gl\") pod \"root-account-create-update-46v2p\" (UID: \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\") " pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811203 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-scripts\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811275 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-scripts\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 
07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811308 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-combined-ca-bundle\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811339 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-ring-data-devices\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811391 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-dispersionconf\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811434 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-swiftconf\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811785 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-swiftconf\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: 
I0307 07:11:51.811855 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-combined-ca-bundle\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811882 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/04152996-2000-4188-840c-1759d193c903-etc-swift\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811912 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8986a49b-0d0d-4f00-b296-870f12fa237d-etc-swift\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.811983 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-dispersionconf\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.812046 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-ring-data-devices\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.812164 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbqt\" (UniqueName: \"kubernetes.io/projected/04152996-2000-4188-840c-1759d193c903-kube-api-access-dtbqt\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.812245 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-944bp\" (UniqueName: \"kubernetes.io/projected/8986a49b-0d0d-4f00-b296-870f12fa237d-kube-api-access-944bp\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.812565 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-scripts\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.812590 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.813515 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-ring-data-devices\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.813910 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8986a49b-0d0d-4f00-b296-870f12fa237d-etc-swift\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.816747 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-dispersionconf\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.816805 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-combined-ca-bundle\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.817003 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-swiftconf\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 
07:11:51.831349 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-944bp\" (UniqueName: \"kubernetes.io/projected/8986a49b-0d0d-4f00-b296-870f12fa237d-kube-api-access-944bp\") pod \"swift-ring-rebalance-ktvg9\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.913963 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/04152996-2000-4188-840c-1759d193c903-etc-swift\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.914223 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-dispersionconf\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.914283 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtbqt\" (UniqueName: \"kubernetes.io/projected/04152996-2000-4188-840c-1759d193c903-kube-api-access-dtbqt\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.914328 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-scripts\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.914349 4941 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-combined-ca-bundle\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.914371 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-ring-data-devices\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.914415 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-swiftconf\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.914828 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/04152996-2000-4188-840c-1759d193c903-etc-swift\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.915429 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-scripts\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.915775 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-ring-data-devices\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.919428 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-dispersionconf\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.920981 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-swiftconf\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.922244 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-combined-ca-bundle\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:51 crc kubenswrapper[4941]: I0307 07:11:51.930009 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbqt\" (UniqueName: \"kubernetes.io/projected/04152996-2000-4188-840c-1759d193c903-kube-api-access-dtbqt\") pod \"swift-ring-rebalance-m9v9g\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.012755 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.111296 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.123652 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.222942 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-combined-ca-bundle\") pod \"8986a49b-0d0d-4f00-b296-870f12fa237d\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223024 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8986a49b-0d0d-4f00-b296-870f12fa237d-etc-swift\") pod \"8986a49b-0d0d-4f00-b296-870f12fa237d\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223064 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-dispersionconf\") pod \"8986a49b-0d0d-4f00-b296-870f12fa237d\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223111 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-scripts\") pod \"8986a49b-0d0d-4f00-b296-870f12fa237d\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223274 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-swiftconf\") pod \"8986a49b-0d0d-4f00-b296-870f12fa237d\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223325 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-ring-data-devices\") pod \"8986a49b-0d0d-4f00-b296-870f12fa237d\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223391 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-944bp\" (UniqueName: \"kubernetes.io/projected/8986a49b-0d0d-4f00-b296-870f12fa237d-kube-api-access-944bp\") pod \"8986a49b-0d0d-4f00-b296-870f12fa237d\" (UID: \"8986a49b-0d0d-4f00-b296-870f12fa237d\") " Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223491 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8986a49b-0d0d-4f00-b296-870f12fa237d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8986a49b-0d0d-4f00-b296-870f12fa237d" (UID: "8986a49b-0d0d-4f00-b296-870f12fa237d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223834 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-scripts" (OuterVolumeSpecName: "scripts") pod "8986a49b-0d0d-4f00-b296-870f12fa237d" (UID: "8986a49b-0d0d-4f00-b296-870f12fa237d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.223868 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8986a49b-0d0d-4f00-b296-870f12fa237d" (UID: "8986a49b-0d0d-4f00-b296-870f12fa237d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.224048 4941 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.224063 4941 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8986a49b-0d0d-4f00-b296-870f12fa237d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.224073 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8986a49b-0d0d-4f00-b296-870f12fa237d-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.226646 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8986a49b-0d0d-4f00-b296-870f12fa237d-kube-api-access-944bp" (OuterVolumeSpecName: "kube-api-access-944bp") pod "8986a49b-0d0d-4f00-b296-870f12fa237d" (UID: "8986a49b-0d0d-4f00-b296-870f12fa237d"). InnerVolumeSpecName "kube-api-access-944bp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.227178 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8986a49b-0d0d-4f00-b296-870f12fa237d" (UID: "8986a49b-0d0d-4f00-b296-870f12fa237d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.227694 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8986a49b-0d0d-4f00-b296-870f12fa237d" (UID: "8986a49b-0d0d-4f00-b296-870f12fa237d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.228565 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8986a49b-0d0d-4f00-b296-870f12fa237d" (UID: "8986a49b-0d0d-4f00-b296-870f12fa237d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:11:52 crc kubenswrapper[4941]: W0307 07:11:52.304364 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba28e625_ebc4_4dc0_a59b_539fa676ce8f.slice/crio-f95d8f511c267cbc6d68404a88c63fba2564cf3ec184dd5552784b8d3331810f WatchSource:0}: Error finding container f95d8f511c267cbc6d68404a88c63fba2564cf3ec184dd5552784b8d3331810f: Status 404 returned error can't find the container with id f95d8f511c267cbc6d68404a88c63fba2564cf3ec184dd5552784b8d3331810f Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.310895 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-46v2p"] Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.325700 4941 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.325746 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-944bp\" (UniqueName: \"kubernetes.io/projected/8986a49b-0d0d-4f00-b296-870f12fa237d-kube-api-access-944bp\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.325762 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.325779 4941 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8986a49b-0d0d-4f00-b296-870f12fa237d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:52 crc kubenswrapper[4941]: I0307 07:11:52.411262 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m9v9g"] 
Mar 07 07:11:52 crc kubenswrapper[4941]: W0307 07:11:52.411535 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04152996_2000_4188_840c_1759d193c903.slice/crio-be4b09a954a1b9dbac4461b1545b18cf1d5d6f14d6793fdedc05fae9c67bfd36 WatchSource:0}: Error finding container be4b09a954a1b9dbac4461b1545b18cf1d5d6f14d6793fdedc05fae9c67bfd36: Status 404 returned error can't find the container with id be4b09a954a1b9dbac4461b1545b18cf1d5d6f14d6793fdedc05fae9c67bfd36 Mar 07 07:11:53 crc kubenswrapper[4941]: I0307 07:11:53.118424 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m9v9g" event={"ID":"04152996-2000-4188-840c-1759d193c903","Type":"ContainerStarted","Data":"be4b09a954a1b9dbac4461b1545b18cf1d5d6f14d6793fdedc05fae9c67bfd36"} Mar 07 07:11:53 crc kubenswrapper[4941]: I0307 07:11:53.119952 4941 generic.go:334] "Generic (PLEG): container finished" podID="ba28e625-ebc4-4dc0-a59b-539fa676ce8f" containerID="91f78e3e58dc63c31d24b89ad1468484bedb657015f4630503f7c2e482731d6d" exitCode=0 Mar 07 07:11:53 crc kubenswrapper[4941]: I0307 07:11:53.120004 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ktvg9" Mar 07 07:11:53 crc kubenswrapper[4941]: I0307 07:11:53.128430 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-46v2p" event={"ID":"ba28e625-ebc4-4dc0-a59b-539fa676ce8f","Type":"ContainerDied","Data":"91f78e3e58dc63c31d24b89ad1468484bedb657015f4630503f7c2e482731d6d"} Mar 07 07:11:53 crc kubenswrapper[4941]: I0307 07:11:53.128462 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-46v2p" event={"ID":"ba28e625-ebc4-4dc0-a59b-539fa676ce8f","Type":"ContainerStarted","Data":"f95d8f511c267cbc6d68404a88c63fba2564cf3ec184dd5552784b8d3331810f"} Mar 07 07:11:53 crc kubenswrapper[4941]: I0307 07:11:53.187827 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ktvg9"] Mar 07 07:11:53 crc kubenswrapper[4941]: I0307 07:11:53.194890 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-ktvg9"] Mar 07 07:11:53 crc kubenswrapper[4941]: I0307 07:11:53.965607 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8986a49b-0d0d-4f00-b296-870f12fa237d" path="/var/lib/kubelet/pods/8986a49b-0d0d-4f00-b296-870f12fa237d/volumes" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.335145 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.426261 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x9gl\" (UniqueName: \"kubernetes.io/projected/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-kube-api-access-4x9gl\") pod \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\" (UID: \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\") " Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.426474 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-operator-scripts\") pod \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\" (UID: \"ba28e625-ebc4-4dc0-a59b-539fa676ce8f\") " Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.427298 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba28e625-ebc4-4dc0-a59b-539fa676ce8f" (UID: "ba28e625-ebc4-4dc0-a59b-539fa676ce8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.427906 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.433042 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-kube-api-access-4x9gl" (OuterVolumeSpecName: "kube-api-access-4x9gl") pod "ba28e625-ebc4-4dc0-a59b-539fa676ce8f" (UID: "ba28e625-ebc4-4dc0-a59b-539fa676ce8f"). InnerVolumeSpecName "kube-api-access-4x9gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.481151 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-5ft6f"] Mar 07 07:11:55 crc kubenswrapper[4941]: E0307 07:11:55.481602 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba28e625-ebc4-4dc0-a59b-539fa676ce8f" containerName="mariadb-account-create-update" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.481620 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba28e625-ebc4-4dc0-a59b-539fa676ce8f" containerName="mariadb-account-create-update" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.481804 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba28e625-ebc4-4dc0-a59b-539fa676ce8f" containerName="mariadb-account-create-update" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.482327 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.511179 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5ft6f"] Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.520511 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-582b-account-create-update-c9h8k"] Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.521829 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.523179 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.528171 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-582b-account-create-update-c9h8k"] Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.529346 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.529489 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x9gl\" (UniqueName: \"kubernetes.io/projected/ba28e625-ebc4-4dc0-a59b-539fa676ce8f-kube-api-access-4x9gl\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:55 crc kubenswrapper[4941]: E0307 07:11:55.529648 4941 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:11:55 crc kubenswrapper[4941]: E0307 07:11:55.529734 4941 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:11:55 crc kubenswrapper[4941]: E0307 07:11:55.529845 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift podName:6a5f223a-7907-42a5-954b-fafc3c4b78da nodeName:}" failed. No retries permitted until 2026-03-07 07:12:03.529828792 +0000 UTC m=+1220.482194257 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift") pod "swift-storage-0" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da") : configmap "swift-ring-files" not found Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.616058 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-pfpsm"] Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.617350 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.627291 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pfpsm"] Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.630729 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv6xv\" (UniqueName: \"kubernetes.io/projected/38422d86-9fa3-4547-a810-106f783ac38a-kube-api-access-gv6xv\") pod \"keystone-582b-account-create-update-c9h8k\" (UID: \"38422d86-9fa3-4547-a810-106f783ac38a\") " pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.630826 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38422d86-9fa3-4547-a810-106f783ac38a-operator-scripts\") pod \"keystone-582b-account-create-update-c9h8k\" (UID: \"38422d86-9fa3-4547-a810-106f783ac38a\") " pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.630876 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491b032-0a65-4d6e-904e-b464a0acfcda-operator-scripts\") pod \"keystone-db-create-5ft6f\" (UID: \"0491b032-0a65-4d6e-904e-b464a0acfcda\") " 
pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.630923 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlj4s\" (UniqueName: \"kubernetes.io/projected/0491b032-0a65-4d6e-904e-b464a0acfcda-kube-api-access-nlj4s\") pod \"keystone-db-create-5ft6f\" (UID: \"0491b032-0a65-4d6e-904e-b464a0acfcda\") " pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.688944 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-346c-account-create-update-94z8w"] Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.690184 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.691969 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.706330 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-346c-account-create-update-94z8w"] Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.733215 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38422d86-9fa3-4547-a810-106f783ac38a-operator-scripts\") pod \"keystone-582b-account-create-update-c9h8k\" (UID: \"38422d86-9fa3-4547-a810-106f783ac38a\") " pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.733297 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72be4758-3939-4551-89be-4927ddb81638-operator-scripts\") pod \"placement-db-create-pfpsm\" (UID: \"72be4758-3939-4551-89be-4927ddb81638\") " pod="openstack/placement-db-create-pfpsm" Mar 
07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.733379 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491b032-0a65-4d6e-904e-b464a0acfcda-operator-scripts\") pod \"keystone-db-create-5ft6f\" (UID: \"0491b032-0a65-4d6e-904e-b464a0acfcda\") " pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.733504 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlj4s\" (UniqueName: \"kubernetes.io/projected/0491b032-0a65-4d6e-904e-b464a0acfcda-kube-api-access-nlj4s\") pod \"keystone-db-create-5ft6f\" (UID: \"0491b032-0a65-4d6e-904e-b464a0acfcda\") " pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.733578 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45dh8\" (UniqueName: \"kubernetes.io/projected/72be4758-3939-4551-89be-4927ddb81638-kube-api-access-45dh8\") pod \"placement-db-create-pfpsm\" (UID: \"72be4758-3939-4551-89be-4927ddb81638\") " pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.733638 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv6xv\" (UniqueName: \"kubernetes.io/projected/38422d86-9fa3-4547-a810-106f783ac38a-kube-api-access-gv6xv\") pod \"keystone-582b-account-create-update-c9h8k\" (UID: \"38422d86-9fa3-4547-a810-106f783ac38a\") " pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.735060 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38422d86-9fa3-4547-a810-106f783ac38a-operator-scripts\") pod \"keystone-582b-account-create-update-c9h8k\" (UID: \"38422d86-9fa3-4547-a810-106f783ac38a\") " 
pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.735978 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491b032-0a65-4d6e-904e-b464a0acfcda-operator-scripts\") pod \"keystone-db-create-5ft6f\" (UID: \"0491b032-0a65-4d6e-904e-b464a0acfcda\") " pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.757040 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlj4s\" (UniqueName: \"kubernetes.io/projected/0491b032-0a65-4d6e-904e-b464a0acfcda-kube-api-access-nlj4s\") pod \"keystone-db-create-5ft6f\" (UID: \"0491b032-0a65-4d6e-904e-b464a0acfcda\") " pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.757291 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv6xv\" (UniqueName: \"kubernetes.io/projected/38422d86-9fa3-4547-a810-106f783ac38a-kube-api-access-gv6xv\") pod \"keystone-582b-account-create-update-c9h8k\" (UID: \"38422d86-9fa3-4547-a810-106f783ac38a\") " pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.807381 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.834594 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc484b5-13a7-48df-a417-3f04600f9320-operator-scripts\") pod \"placement-346c-account-create-update-94z8w\" (UID: \"2dc484b5-13a7-48df-a417-3f04600f9320\") " pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.834677 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72be4758-3939-4551-89be-4927ddb81638-operator-scripts\") pod \"placement-db-create-pfpsm\" (UID: \"72be4758-3939-4551-89be-4927ddb81638\") " pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.834710 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnxdb\" (UniqueName: \"kubernetes.io/projected/2dc484b5-13a7-48df-a417-3f04600f9320-kube-api-access-tnxdb\") pod \"placement-346c-account-create-update-94z8w\" (UID: \"2dc484b5-13a7-48df-a417-3f04600f9320\") " pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.834788 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45dh8\" (UniqueName: \"kubernetes.io/projected/72be4758-3939-4551-89be-4927ddb81638-kube-api-access-45dh8\") pod \"placement-db-create-pfpsm\" (UID: \"72be4758-3939-4551-89be-4927ddb81638\") " pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.835973 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72be4758-3939-4551-89be-4927ddb81638-operator-scripts\") pod 
\"placement-db-create-pfpsm\" (UID: \"72be4758-3939-4551-89be-4927ddb81638\") " pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.838626 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.852769 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45dh8\" (UniqueName: \"kubernetes.io/projected/72be4758-3939-4551-89be-4927ddb81638-kube-api-access-45dh8\") pod \"placement-db-create-pfpsm\" (UID: \"72be4758-3939-4551-89be-4927ddb81638\") " pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.934896 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.938760 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc484b5-13a7-48df-a417-3f04600f9320-operator-scripts\") pod \"placement-346c-account-create-update-94z8w\" (UID: \"2dc484b5-13a7-48df-a417-3f04600f9320\") " pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.939009 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnxdb\" (UniqueName: \"kubernetes.io/projected/2dc484b5-13a7-48df-a417-3f04600f9320-kube-api-access-tnxdb\") pod \"placement-346c-account-create-update-94z8w\" (UID: \"2dc484b5-13a7-48df-a417-3f04600f9320\") " pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.939668 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc484b5-13a7-48df-a417-3f04600f9320-operator-scripts\") pod 
\"placement-346c-account-create-update-94z8w\" (UID: \"2dc484b5-13a7-48df-a417-3f04600f9320\") " pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:55 crc kubenswrapper[4941]: I0307 07:11:55.959916 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnxdb\" (UniqueName: \"kubernetes.io/projected/2dc484b5-13a7-48df-a417-3f04600f9320-kube-api-access-tnxdb\") pod \"placement-346c-account-create-update-94z8w\" (UID: \"2dc484b5-13a7-48df-a417-3f04600f9320\") " pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.003658 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.167139 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-46v2p" event={"ID":"ba28e625-ebc4-4dc0-a59b-539fa676ce8f","Type":"ContainerDied","Data":"f95d8f511c267cbc6d68404a88c63fba2564cf3ec184dd5552784b8d3331810f"} Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.167180 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f95d8f511c267cbc6d68404a88c63fba2564cf3ec184dd5552784b8d3331810f" Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.167242 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-46v2p" Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.176029 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m9v9g" event={"ID":"04152996-2000-4188-840c-1759d193c903","Type":"ContainerStarted","Data":"8fc805bc99c0c8af89d3b1cb58369ff1e706429e415c177a6ab724b2d108401f"} Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.198055 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-m9v9g" podStartSLOduration=2.414062155 podStartE2EDuration="5.198038797s" podCreationTimestamp="2026-03-07 07:11:51 +0000 UTC" firstStartedPulling="2026-03-07 07:11:52.414659493 +0000 UTC m=+1209.367024958" lastFinishedPulling="2026-03-07 07:11:55.198636135 +0000 UTC m=+1212.151001600" observedRunningTime="2026-03-07 07:11:56.195653669 +0000 UTC m=+1213.148019124" watchObservedRunningTime="2026-03-07 07:11:56.198038797 +0000 UTC m=+1213.150404262" Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.274690 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5ft6f"] Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.403219 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-582b-account-create-update-c9h8k"] Mar 07 07:11:56 crc kubenswrapper[4941]: W0307 07:11:56.418680 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38422d86_9fa3_4547_a810_106f783ac38a.slice/crio-ab706146b0560183c22387e5667b78ee83746d6bd779efc8f5da481eec9b6af5 WatchSource:0}: Error finding container ab706146b0560183c22387e5667b78ee83746d6bd779efc8f5da481eec9b6af5: Status 404 returned error can't find the container with id ab706146b0560183c22387e5667b78ee83746d6bd779efc8f5da481eec9b6af5 Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.552313 4941 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/placement-db-create-pfpsm"] Mar 07 07:11:56 crc kubenswrapper[4941]: W0307 07:11:56.552481 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72be4758_3939_4551_89be_4927ddb81638.slice/crio-cb1b08d295983037d8476184459d818abe150e5e955b843b04eef27ecc4e5545 WatchSource:0}: Error finding container cb1b08d295983037d8476184459d818abe150e5e955b843b04eef27ecc4e5545: Status 404 returned error can't find the container with id cb1b08d295983037d8476184459d818abe150e5e955b843b04eef27ecc4e5545 Mar 07 07:11:56 crc kubenswrapper[4941]: W0307 07:11:56.588970 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dc484b5_13a7_48df_a417_3f04600f9320.slice/crio-e6730668e323bea0e505a196f5ae1825a08d573d4d159ac2b72df5912e1ded69 WatchSource:0}: Error finding container e6730668e323bea0e505a196f5ae1825a08d573d4d159ac2b72df5912e1ded69: Status 404 returned error can't find the container with id e6730668e323bea0e505a196f5ae1825a08d573d4d159ac2b72df5912e1ded69 Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.589146 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-346c-account-create-update-94z8w"] Mar 07 07:11:56 crc kubenswrapper[4941]: I0307 07:11:56.924610 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.002688 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-7bc8b"] Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.002900 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" podUID="b677cb4e-34de-4c2e-a9b9-507597162fa4" containerName="dnsmasq-dns" 
containerID="cri-o://2968ed0672bf074989ea855c6a0d259bc1a63da30856401b4384899242b4ff0b" gracePeriod=10 Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.194491 4941 generic.go:334] "Generic (PLEG): container finished" podID="72be4758-3939-4551-89be-4927ddb81638" containerID="9f3a6b72f7f858b3900528d9ee4a3d8a16ca3a77b9b2764fdcf03670e1821e59" exitCode=0 Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.195057 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pfpsm" event={"ID":"72be4758-3939-4551-89be-4927ddb81638","Type":"ContainerDied","Data":"9f3a6b72f7f858b3900528d9ee4a3d8a16ca3a77b9b2764fdcf03670e1821e59"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.195115 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pfpsm" event={"ID":"72be4758-3939-4551-89be-4927ddb81638","Type":"ContainerStarted","Data":"cb1b08d295983037d8476184459d818abe150e5e955b843b04eef27ecc4e5545"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.200258 4941 generic.go:334] "Generic (PLEG): container finished" podID="0491b032-0a65-4d6e-904e-b464a0acfcda" containerID="2585f0f88683c18d6231df3938f2dc939959c3827623adfcebb1e2c4de47e762" exitCode=0 Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.200356 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5ft6f" event={"ID":"0491b032-0a65-4d6e-904e-b464a0acfcda","Type":"ContainerDied","Data":"2585f0f88683c18d6231df3938f2dc939959c3827623adfcebb1e2c4de47e762"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.200489 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5ft6f" event={"ID":"0491b032-0a65-4d6e-904e-b464a0acfcda","Type":"ContainerStarted","Data":"62b42752ec459e834f8f7eac23d0ccc24bfbaa24a55335f94f19e9a56a3dce6a"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.214244 4941 generic.go:334] "Generic (PLEG): container finished" 
podID="b677cb4e-34de-4c2e-a9b9-507597162fa4" containerID="2968ed0672bf074989ea855c6a0d259bc1a63da30856401b4384899242b4ff0b" exitCode=0 Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.214357 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" event={"ID":"b677cb4e-34de-4c2e-a9b9-507597162fa4","Type":"ContainerDied","Data":"2968ed0672bf074989ea855c6a0d259bc1a63da30856401b4384899242b4ff0b"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.220618 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-346c-account-create-update-94z8w" event={"ID":"2dc484b5-13a7-48df-a417-3f04600f9320","Type":"ContainerStarted","Data":"56166620984b0f0801be41176baf5cfe9ea22ccae5626354a2b64a1dbe53dc43"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.220652 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-346c-account-create-update-94z8w" event={"ID":"2dc484b5-13a7-48df-a417-3f04600f9320","Type":"ContainerStarted","Data":"e6730668e323bea0e505a196f5ae1825a08d573d4d159ac2b72df5912e1ded69"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.221802 4941 generic.go:334] "Generic (PLEG): container finished" podID="38422d86-9fa3-4547-a810-106f783ac38a" containerID="11047d81854f9312c66dbdfc8c3e2ef41e2db704c8c78fd59294ed3fb616fe1c" exitCode=0 Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.221857 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-582b-account-create-update-c9h8k" event={"ID":"38422d86-9fa3-4547-a810-106f783ac38a","Type":"ContainerDied","Data":"11047d81854f9312c66dbdfc8c3e2ef41e2db704c8c78fd59294ed3fb616fe1c"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.221891 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-582b-account-create-update-c9h8k" 
event={"ID":"38422d86-9fa3-4547-a810-106f783ac38a","Type":"ContainerStarted","Data":"ab706146b0560183c22387e5667b78ee83746d6bd779efc8f5da481eec9b6af5"} Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.440897 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.563392 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-config\") pod \"b677cb4e-34de-4c2e-a9b9-507597162fa4\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.563532 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-dns-svc\") pod \"b677cb4e-34de-4c2e-a9b9-507597162fa4\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.563562 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7m72\" (UniqueName: \"kubernetes.io/projected/b677cb4e-34de-4c2e-a9b9-507597162fa4-kube-api-access-s7m72\") pod \"b677cb4e-34de-4c2e-a9b9-507597162fa4\" (UID: \"b677cb4e-34de-4c2e-a9b9-507597162fa4\") " Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.569197 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b677cb4e-34de-4c2e-a9b9-507597162fa4-kube-api-access-s7m72" (OuterVolumeSpecName: "kube-api-access-s7m72") pod "b677cb4e-34de-4c2e-a9b9-507597162fa4" (UID: "b677cb4e-34de-4c2e-a9b9-507597162fa4"). InnerVolumeSpecName "kube-api-access-s7m72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.601014 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b677cb4e-34de-4c2e-a9b9-507597162fa4" (UID: "b677cb4e-34de-4c2e-a9b9-507597162fa4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.603991 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-config" (OuterVolumeSpecName: "config") pod "b677cb4e-34de-4c2e-a9b9-507597162fa4" (UID: "b677cb4e-34de-4c2e-a9b9-507597162fa4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.665758 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.665794 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b677cb4e-34de-4c2e-a9b9-507597162fa4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.665806 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7m72\" (UniqueName: \"kubernetes.io/projected/b677cb4e-34de-4c2e-a9b9-507597162fa4-kube-api-access-s7m72\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.829668 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-46v2p"] Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.836624 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-46v2p"] Mar 07 07:11:57 crc kubenswrapper[4941]: I0307 07:11:57.968062 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba28e625-ebc4-4dc0-a59b-539fa676ce8f" path="/var/lib/kubelet/pods/ba28e625-ebc4-4dc0-a59b-539fa676ce8f/volumes" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.231176 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" event={"ID":"b677cb4e-34de-4c2e-a9b9-507597162fa4","Type":"ContainerDied","Data":"6d7313e1258e97c687796c2ac717292a880c1c691e67dde6766bef71701087a5"} Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.231610 4941 scope.go:117] "RemoveContainer" containerID="2968ed0672bf074989ea855c6a0d259bc1a63da30856401b4384899242b4ff0b" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.231189 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-7bc8b" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.233822 4941 generic.go:334] "Generic (PLEG): container finished" podID="2dc484b5-13a7-48df-a417-3f04600f9320" containerID="56166620984b0f0801be41176baf5cfe9ea22ccae5626354a2b64a1dbe53dc43" exitCode=0 Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.233940 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-346c-account-create-update-94z8w" event={"ID":"2dc484b5-13a7-48df-a417-3f04600f9320","Type":"ContainerDied","Data":"56166620984b0f0801be41176baf5cfe9ea22ccae5626354a2b64a1dbe53dc43"} Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.281852 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-7bc8b"] Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.286736 4941 scope.go:117] "RemoveContainer" containerID="1ac2a08ce8397c02e977b41a27ecbafc7f3363e7afebad69f337aaa3258115e1" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.302051 4941 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-7bc8b"] Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.656552 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.795897 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491b032-0a65-4d6e-904e-b464a0acfcda-operator-scripts\") pod \"0491b032-0a65-4d6e-904e-b464a0acfcda\" (UID: \"0491b032-0a65-4d6e-904e-b464a0acfcda\") " Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.796109 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlj4s\" (UniqueName: \"kubernetes.io/projected/0491b032-0a65-4d6e-904e-b464a0acfcda-kube-api-access-nlj4s\") pod \"0491b032-0a65-4d6e-904e-b464a0acfcda\" (UID: \"0491b032-0a65-4d6e-904e-b464a0acfcda\") " Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.796395 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0491b032-0a65-4d6e-904e-b464a0acfcda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0491b032-0a65-4d6e-904e-b464a0acfcda" (UID: "0491b032-0a65-4d6e-904e-b464a0acfcda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.802970 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0491b032-0a65-4d6e-904e-b464a0acfcda-kube-api-access-nlj4s" (OuterVolumeSpecName: "kube-api-access-nlj4s") pod "0491b032-0a65-4d6e-904e-b464a0acfcda" (UID: "0491b032-0a65-4d6e-904e-b464a0acfcda"). InnerVolumeSpecName "kube-api-access-nlj4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.841356 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.854907 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.877082 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.897865 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlj4s\" (UniqueName: \"kubernetes.io/projected/0491b032-0a65-4d6e-904e-b464a0acfcda-kube-api-access-nlj4s\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:58 crc kubenswrapper[4941]: I0307 07:11:58.897905 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491b032-0a65-4d6e-904e-b464a0acfcda-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:58.999379 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc484b5-13a7-48df-a417-3f04600f9320-operator-scripts\") pod \"2dc484b5-13a7-48df-a417-3f04600f9320\" (UID: \"2dc484b5-13a7-48df-a417-3f04600f9320\") " Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:58.999438 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnxdb\" (UniqueName: \"kubernetes.io/projected/2dc484b5-13a7-48df-a417-3f04600f9320-kube-api-access-tnxdb\") pod \"2dc484b5-13a7-48df-a417-3f04600f9320\" (UID: \"2dc484b5-13a7-48df-a417-3f04600f9320\") " Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:58.999465 4941 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv6xv\" (UniqueName: \"kubernetes.io/projected/38422d86-9fa3-4547-a810-106f783ac38a-kube-api-access-gv6xv\") pod \"38422d86-9fa3-4547-a810-106f783ac38a\" (UID: \"38422d86-9fa3-4547-a810-106f783ac38a\") " Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:58.999576 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72be4758-3939-4551-89be-4927ddb81638-operator-scripts\") pod \"72be4758-3939-4551-89be-4927ddb81638\" (UID: \"72be4758-3939-4551-89be-4927ddb81638\") " Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:58.999609 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38422d86-9fa3-4547-a810-106f783ac38a-operator-scripts\") pod \"38422d86-9fa3-4547-a810-106f783ac38a\" (UID: \"38422d86-9fa3-4547-a810-106f783ac38a\") " Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:58.999653 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45dh8\" (UniqueName: \"kubernetes.io/projected/72be4758-3939-4551-89be-4927ddb81638-kube-api-access-45dh8\") pod \"72be4758-3939-4551-89be-4927ddb81638\" (UID: \"72be4758-3939-4551-89be-4927ddb81638\") " Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:58.999854 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc484b5-13a7-48df-a417-3f04600f9320-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dc484b5-13a7-48df-a417-3f04600f9320" (UID: "2dc484b5-13a7-48df-a417-3f04600f9320"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:58.999966 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc484b5-13a7-48df-a417-3f04600f9320-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.000286 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72be4758-3939-4551-89be-4927ddb81638-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72be4758-3939-4551-89be-4927ddb81638" (UID: "72be4758-3939-4551-89be-4927ddb81638"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.001020 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38422d86-9fa3-4547-a810-106f783ac38a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38422d86-9fa3-4547-a810-106f783ac38a" (UID: "38422d86-9fa3-4547-a810-106f783ac38a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.002060 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc484b5-13a7-48df-a417-3f04600f9320-kube-api-access-tnxdb" (OuterVolumeSpecName: "kube-api-access-tnxdb") pod "2dc484b5-13a7-48df-a417-3f04600f9320" (UID: "2dc484b5-13a7-48df-a417-3f04600f9320"). InnerVolumeSpecName "kube-api-access-tnxdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.003142 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72be4758-3939-4551-89be-4927ddb81638-kube-api-access-45dh8" (OuterVolumeSpecName: "kube-api-access-45dh8") pod "72be4758-3939-4551-89be-4927ddb81638" (UID: "72be4758-3939-4551-89be-4927ddb81638"). InnerVolumeSpecName "kube-api-access-45dh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.006956 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38422d86-9fa3-4547-a810-106f783ac38a-kube-api-access-gv6xv" (OuterVolumeSpecName: "kube-api-access-gv6xv") pod "38422d86-9fa3-4547-a810-106f783ac38a" (UID: "38422d86-9fa3-4547-a810-106f783ac38a"). InnerVolumeSpecName "kube-api-access-gv6xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.102686 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnxdb\" (UniqueName: \"kubernetes.io/projected/2dc484b5-13a7-48df-a417-3f04600f9320-kube-api-access-tnxdb\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.102749 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv6xv\" (UniqueName: \"kubernetes.io/projected/38422d86-9fa3-4547-a810-106f783ac38a-kube-api-access-gv6xv\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.102769 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72be4758-3939-4551-89be-4927ddb81638-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.102787 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/38422d86-9fa3-4547-a810-106f783ac38a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.102846 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45dh8\" (UniqueName: \"kubernetes.io/projected/72be4758-3939-4551-89be-4927ddb81638-kube-api-access-45dh8\") on node \"crc\" DevicePath \"\"" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.244291 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5ft6f" event={"ID":"0491b032-0a65-4d6e-904e-b464a0acfcda","Type":"ContainerDied","Data":"62b42752ec459e834f8f7eac23d0ccc24bfbaa24a55335f94f19e9a56a3dce6a"} Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.244708 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b42752ec459e834f8f7eac23d0ccc24bfbaa24a55335f94f19e9a56a3dce6a" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.244447 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5ft6f" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.248180 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-346c-account-create-update-94z8w" event={"ID":"2dc484b5-13a7-48df-a417-3f04600f9320","Type":"ContainerDied","Data":"e6730668e323bea0e505a196f5ae1825a08d573d4d159ac2b72df5912e1ded69"} Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.248225 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6730668e323bea0e505a196f5ae1825a08d573d4d159ac2b72df5912e1ded69" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.248285 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-346c-account-create-update-94z8w" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.253935 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-582b-account-create-update-c9h8k" event={"ID":"38422d86-9fa3-4547-a810-106f783ac38a","Type":"ContainerDied","Data":"ab706146b0560183c22387e5667b78ee83746d6bd779efc8f5da481eec9b6af5"} Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.253974 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab706146b0560183c22387e5667b78ee83746d6bd779efc8f5da481eec9b6af5" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.253958 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-582b-account-create-update-c9h8k" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.257182 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pfpsm" event={"ID":"72be4758-3939-4551-89be-4927ddb81638","Type":"ContainerDied","Data":"cb1b08d295983037d8476184459d818abe150e5e955b843b04eef27ecc4e5545"} Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.257220 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb1b08d295983037d8476184459d818abe150e5e955b843b04eef27ecc4e5545" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.257228 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-pfpsm" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.672398 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-cwhp5"] Mar 07 07:11:59 crc kubenswrapper[4941]: E0307 07:11:59.673234 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72be4758-3939-4551-89be-4927ddb81638" containerName="mariadb-database-create" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673250 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="72be4758-3939-4551-89be-4927ddb81638" containerName="mariadb-database-create" Mar 07 07:11:59 crc kubenswrapper[4941]: E0307 07:11:59.673270 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc484b5-13a7-48df-a417-3f04600f9320" containerName="mariadb-account-create-update" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673277 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc484b5-13a7-48df-a417-3f04600f9320" containerName="mariadb-account-create-update" Mar 07 07:11:59 crc kubenswrapper[4941]: E0307 07:11:59.673291 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b677cb4e-34de-4c2e-a9b9-507597162fa4" containerName="dnsmasq-dns" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673301 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b677cb4e-34de-4c2e-a9b9-507597162fa4" containerName="dnsmasq-dns" Mar 07 07:11:59 crc kubenswrapper[4941]: E0307 07:11:59.673314 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38422d86-9fa3-4547-a810-106f783ac38a" containerName="mariadb-account-create-update" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673323 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="38422d86-9fa3-4547-a810-106f783ac38a" containerName="mariadb-account-create-update" Mar 07 07:11:59 crc kubenswrapper[4941]: E0307 07:11:59.673339 4941 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b677cb4e-34de-4c2e-a9b9-507597162fa4" containerName="init" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673346 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b677cb4e-34de-4c2e-a9b9-507597162fa4" containerName="init" Mar 07 07:11:59 crc kubenswrapper[4941]: E0307 07:11:59.673367 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0491b032-0a65-4d6e-904e-b464a0acfcda" containerName="mariadb-database-create" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673374 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="0491b032-0a65-4d6e-904e-b464a0acfcda" containerName="mariadb-database-create" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673558 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="0491b032-0a65-4d6e-904e-b464a0acfcda" containerName="mariadb-database-create" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673572 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="38422d86-9fa3-4547-a810-106f783ac38a" containerName="mariadb-account-create-update" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673587 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b677cb4e-34de-4c2e-a9b9-507597162fa4" containerName="dnsmasq-dns" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673597 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc484b5-13a7-48df-a417-3f04600f9320" containerName="mariadb-account-create-update" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.673603 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="72be4758-3939-4551-89be-4927ddb81638" containerName="mariadb-database-create" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.676677 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cwhp5" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.697652 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cwhp5"] Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.755060 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-040c-account-create-update-6phkj"] Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.756285 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.758152 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.766567 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-040c-account-create-update-6phkj"] Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.819143 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/471729e9-1d55-4a19-9fc7-2a5313410c46-operator-scripts\") pod \"glance-db-create-cwhp5\" (UID: \"471729e9-1d55-4a19-9fc7-2a5313410c46\") " pod="openstack/glance-db-create-cwhp5" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.819208 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnh7\" (UniqueName: \"kubernetes.io/projected/471729e9-1d55-4a19-9fc7-2a5313410c46-kube-api-access-llnh7\") pod \"glance-db-create-cwhp5\" (UID: \"471729e9-1d55-4a19-9fc7-2a5313410c46\") " pod="openstack/glance-db-create-cwhp5" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.921065 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9v77\" (UniqueName: 
\"kubernetes.io/projected/f561ebac-b036-4d82-8e7a-2a43b031c0ba-kube-api-access-t9v77\") pod \"glance-040c-account-create-update-6phkj\" (UID: \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\") " pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.921240 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/471729e9-1d55-4a19-9fc7-2a5313410c46-operator-scripts\") pod \"glance-db-create-cwhp5\" (UID: \"471729e9-1d55-4a19-9fc7-2a5313410c46\") " pod="openstack/glance-db-create-cwhp5" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.921295 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f561ebac-b036-4d82-8e7a-2a43b031c0ba-operator-scripts\") pod \"glance-040c-account-create-update-6phkj\" (UID: \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\") " pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.921353 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llnh7\" (UniqueName: \"kubernetes.io/projected/471729e9-1d55-4a19-9fc7-2a5313410c46-kube-api-access-llnh7\") pod \"glance-db-create-cwhp5\" (UID: \"471729e9-1d55-4a19-9fc7-2a5313410c46\") " pod="openstack/glance-db-create-cwhp5" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.922225 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/471729e9-1d55-4a19-9fc7-2a5313410c46-operator-scripts\") pod \"glance-db-create-cwhp5\" (UID: \"471729e9-1d55-4a19-9fc7-2a5313410c46\") " pod="openstack/glance-db-create-cwhp5" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.940928 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnh7\" (UniqueName: 
\"kubernetes.io/projected/471729e9-1d55-4a19-9fc7-2a5313410c46-kube-api-access-llnh7\") pod \"glance-db-create-cwhp5\" (UID: \"471729e9-1d55-4a19-9fc7-2a5313410c46\") " pod="openstack/glance-db-create-cwhp5" Mar 07 07:11:59 crc kubenswrapper[4941]: I0307 07:11:59.974486 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b677cb4e-34de-4c2e-a9b9-507597162fa4" path="/var/lib/kubelet/pods/b677cb4e-34de-4c2e-a9b9-507597162fa4/volumes" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.017274 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cwhp5" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.024194 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9v77\" (UniqueName: \"kubernetes.io/projected/f561ebac-b036-4d82-8e7a-2a43b031c0ba-kube-api-access-t9v77\") pod \"glance-040c-account-create-update-6phkj\" (UID: \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\") " pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.024325 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f561ebac-b036-4d82-8e7a-2a43b031c0ba-operator-scripts\") pod \"glance-040c-account-create-update-6phkj\" (UID: \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\") " pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.025279 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f561ebac-b036-4d82-8e7a-2a43b031c0ba-operator-scripts\") pod \"glance-040c-account-create-update-6phkj\" (UID: \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\") " pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.069807 4941 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t9v77\" (UniqueName: \"kubernetes.io/projected/f561ebac-b036-4d82-8e7a-2a43b031c0ba-kube-api-access-t9v77\") pod \"glance-040c-account-create-update-6phkj\" (UID: \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\") " pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.076469 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.140459 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547792-rq6d7"] Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.143679 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547792-rq6d7" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.146538 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.146642 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.146710 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.151755 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547792-rq6d7"] Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.228287 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhnt\" (UniqueName: \"kubernetes.io/projected/03901d36-4348-43da-ad11-7592d9dd31e6-kube-api-access-nmhnt\") pod \"auto-csr-approver-29547792-rq6d7\" (UID: \"03901d36-4348-43da-ad11-7592d9dd31e6\") " 
pod="openshift-infra/auto-csr-approver-29547792-rq6d7" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.330073 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhnt\" (UniqueName: \"kubernetes.io/projected/03901d36-4348-43da-ad11-7592d9dd31e6-kube-api-access-nmhnt\") pod \"auto-csr-approver-29547792-rq6d7\" (UID: \"03901d36-4348-43da-ad11-7592d9dd31e6\") " pod="openshift-infra/auto-csr-approver-29547792-rq6d7" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.358215 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhnt\" (UniqueName: \"kubernetes.io/projected/03901d36-4348-43da-ad11-7592d9dd31e6-kube-api-access-nmhnt\") pod \"auto-csr-approver-29547792-rq6d7\" (UID: \"03901d36-4348-43da-ad11-7592d9dd31e6\") " pod="openshift-infra/auto-csr-approver-29547792-rq6d7" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.508323 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547792-rq6d7" Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.576689 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cwhp5"] Mar 07 07:12:00 crc kubenswrapper[4941]: W0307 07:12:00.582591 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod471729e9_1d55_4a19_9fc7_2a5313410c46.slice/crio-6d9e58781eed8eb42ddf4147c018b83634ecfa9cd21dcf51ef37f5098cf2b423 WatchSource:0}: Error finding container 6d9e58781eed8eb42ddf4147c018b83634ecfa9cd21dcf51ef37f5098cf2b423: Status 404 returned error can't find the container with id 6d9e58781eed8eb42ddf4147c018b83634ecfa9cd21dcf51ef37f5098cf2b423 Mar 07 07:12:00 crc kubenswrapper[4941]: I0307 07:12:00.690350 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-040c-account-create-update-6phkj"] Mar 07 07:12:00 crc kubenswrapper[4941]: W0307 
07:12:00.690504 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf561ebac_b036_4d82_8e7a_2a43b031c0ba.slice/crio-5da59d1c2f1606959551edf6a0cb34f631e4545cd65361fb26be6cad68434de7 WatchSource:0}: Error finding container 5da59d1c2f1606959551edf6a0cb34f631e4545cd65361fb26be6cad68434de7: Status 404 returned error can't find the container with id 5da59d1c2f1606959551edf6a0cb34f631e4545cd65361fb26be6cad68434de7 Mar 07 07:12:01 crc kubenswrapper[4941]: I0307 07:12:01.004705 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547792-rq6d7"] Mar 07 07:12:01 crc kubenswrapper[4941]: W0307 07:12:01.035670 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03901d36_4348_43da_ad11_7592d9dd31e6.slice/crio-636c21093b812c453bacba3ccdcff470dd583dfe174b805c7c984027c3b57276 WatchSource:0}: Error finding container 636c21093b812c453bacba3ccdcff470dd583dfe174b805c7c984027c3b57276: Status 404 returned error can't find the container with id 636c21093b812c453bacba3ccdcff470dd583dfe174b805c7c984027c3b57276 Mar 07 07:12:01 crc kubenswrapper[4941]: I0307 07:12:01.284684 4941 generic.go:334] "Generic (PLEG): container finished" podID="471729e9-1d55-4a19-9fc7-2a5313410c46" containerID="f261bcf1dd456507063c04b9d5073baa9a160e87a09f3c50d3c9076190c31770" exitCode=0 Mar 07 07:12:01 crc kubenswrapper[4941]: I0307 07:12:01.284780 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cwhp5" event={"ID":"471729e9-1d55-4a19-9fc7-2a5313410c46","Type":"ContainerDied","Data":"f261bcf1dd456507063c04b9d5073baa9a160e87a09f3c50d3c9076190c31770"} Mar 07 07:12:01 crc kubenswrapper[4941]: I0307 07:12:01.284908 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cwhp5" 
event={"ID":"471729e9-1d55-4a19-9fc7-2a5313410c46","Type":"ContainerStarted","Data":"6d9e58781eed8eb42ddf4147c018b83634ecfa9cd21dcf51ef37f5098cf2b423"} Mar 07 07:12:01 crc kubenswrapper[4941]: I0307 07:12:01.286857 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547792-rq6d7" event={"ID":"03901d36-4348-43da-ad11-7592d9dd31e6","Type":"ContainerStarted","Data":"636c21093b812c453bacba3ccdcff470dd583dfe174b805c7c984027c3b57276"} Mar 07 07:12:01 crc kubenswrapper[4941]: I0307 07:12:01.290979 4941 generic.go:334] "Generic (PLEG): container finished" podID="f561ebac-b036-4d82-8e7a-2a43b031c0ba" containerID="f9de7cd1754d7c2a737281737d75a1949c7b38949c7e6b30706e3ab775e70345" exitCode=0 Mar 07 07:12:01 crc kubenswrapper[4941]: I0307 07:12:01.291020 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-040c-account-create-update-6phkj" event={"ID":"f561ebac-b036-4d82-8e7a-2a43b031c0ba","Type":"ContainerDied","Data":"f9de7cd1754d7c2a737281737d75a1949c7b38949c7e6b30706e3ab775e70345"} Mar 07 07:12:01 crc kubenswrapper[4941]: I0307 07:12:01.291041 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-040c-account-create-update-6phkj" event={"ID":"f561ebac-b036-4d82-8e7a-2a43b031c0ba","Type":"ContainerStarted","Data":"5da59d1c2f1606959551edf6a0cb34f631e4545cd65361fb26be6cad68434de7"} Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.715702 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.720147 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cwhp5" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.774525 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llnh7\" (UniqueName: \"kubernetes.io/projected/471729e9-1d55-4a19-9fc7-2a5313410c46-kube-api-access-llnh7\") pod \"471729e9-1d55-4a19-9fc7-2a5313410c46\" (UID: \"471729e9-1d55-4a19-9fc7-2a5313410c46\") " Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.774654 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9v77\" (UniqueName: \"kubernetes.io/projected/f561ebac-b036-4d82-8e7a-2a43b031c0ba-kube-api-access-t9v77\") pod \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\" (UID: \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\") " Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.774764 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/471729e9-1d55-4a19-9fc7-2a5313410c46-operator-scripts\") pod \"471729e9-1d55-4a19-9fc7-2a5313410c46\" (UID: \"471729e9-1d55-4a19-9fc7-2a5313410c46\") " Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.774814 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f561ebac-b036-4d82-8e7a-2a43b031c0ba-operator-scripts\") pod \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\" (UID: \"f561ebac-b036-4d82-8e7a-2a43b031c0ba\") " Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.775551 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471729e9-1d55-4a19-9fc7-2a5313410c46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "471729e9-1d55-4a19-9fc7-2a5313410c46" (UID: "471729e9-1d55-4a19-9fc7-2a5313410c46"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.775570 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f561ebac-b036-4d82-8e7a-2a43b031c0ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f561ebac-b036-4d82-8e7a-2a43b031c0ba" (UID: "f561ebac-b036-4d82-8e7a-2a43b031c0ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.781528 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/471729e9-1d55-4a19-9fc7-2a5313410c46-kube-api-access-llnh7" (OuterVolumeSpecName: "kube-api-access-llnh7") pod "471729e9-1d55-4a19-9fc7-2a5313410c46" (UID: "471729e9-1d55-4a19-9fc7-2a5313410c46"). InnerVolumeSpecName "kube-api-access-llnh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.799701 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f561ebac-b036-4d82-8e7a-2a43b031c0ba-kube-api-access-t9v77" (OuterVolumeSpecName: "kube-api-access-t9v77") pod "f561ebac-b036-4d82-8e7a-2a43b031c0ba" (UID: "f561ebac-b036-4d82-8e7a-2a43b031c0ba"). InnerVolumeSpecName "kube-api-access-t9v77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.848306 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-972xm"] Mar 07 07:12:02 crc kubenswrapper[4941]: E0307 07:12:02.848678 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="471729e9-1d55-4a19-9fc7-2a5313410c46" containerName="mariadb-database-create" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.848690 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="471729e9-1d55-4a19-9fc7-2a5313410c46" containerName="mariadb-database-create" Mar 07 07:12:02 crc kubenswrapper[4941]: E0307 07:12:02.848713 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f561ebac-b036-4d82-8e7a-2a43b031c0ba" containerName="mariadb-account-create-update" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.848721 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f561ebac-b036-4d82-8e7a-2a43b031c0ba" containerName="mariadb-account-create-update" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.848920 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f561ebac-b036-4d82-8e7a-2a43b031c0ba" containerName="mariadb-account-create-update" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.848941 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="471729e9-1d55-4a19-9fc7-2a5313410c46" containerName="mariadb-database-create" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.850140 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-972xm" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.851973 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.855867 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-972xm"] Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.876483 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0adf58-bd20-433a-a80c-0a871ec201b4-operator-scripts\") pod \"root-account-create-update-972xm\" (UID: \"bf0adf58-bd20-433a-a80c-0a871ec201b4\") " pod="openstack/root-account-create-update-972xm" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.876708 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qhlc\" (UniqueName: \"kubernetes.io/projected/bf0adf58-bd20-433a-a80c-0a871ec201b4-kube-api-access-9qhlc\") pod \"root-account-create-update-972xm\" (UID: \"bf0adf58-bd20-433a-a80c-0a871ec201b4\") " pod="openstack/root-account-create-update-972xm" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.876910 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9v77\" (UniqueName: \"kubernetes.io/projected/f561ebac-b036-4d82-8e7a-2a43b031c0ba-kube-api-access-t9v77\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.876997 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/471729e9-1d55-4a19-9fc7-2a5313410c46-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.877077 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f561ebac-b036-4d82-8e7a-2a43b031c0ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.877158 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llnh7\" (UniqueName: \"kubernetes.io/projected/471729e9-1d55-4a19-9fc7-2a5313410c46-kube-api-access-llnh7\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.978380 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0adf58-bd20-433a-a80c-0a871ec201b4-operator-scripts\") pod \"root-account-create-update-972xm\" (UID: \"bf0adf58-bd20-433a-a80c-0a871ec201b4\") " pod="openstack/root-account-create-update-972xm" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.978699 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qhlc\" (UniqueName: \"kubernetes.io/projected/bf0adf58-bd20-433a-a80c-0a871ec201b4-kube-api-access-9qhlc\") pod \"root-account-create-update-972xm\" (UID: \"bf0adf58-bd20-433a-a80c-0a871ec201b4\") " pod="openstack/root-account-create-update-972xm" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.979520 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0adf58-bd20-433a-a80c-0a871ec201b4-operator-scripts\") pod \"root-account-create-update-972xm\" (UID: \"bf0adf58-bd20-433a-a80c-0a871ec201b4\") " pod="openstack/root-account-create-update-972xm" Mar 07 07:12:02 crc kubenswrapper[4941]: I0307 07:12:02.997385 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qhlc\" (UniqueName: \"kubernetes.io/projected/bf0adf58-bd20-433a-a80c-0a871ec201b4-kube-api-access-9qhlc\") pod \"root-account-create-update-972xm\" (UID: \"bf0adf58-bd20-433a-a80c-0a871ec201b4\") " 
pod="openstack/root-account-create-update-972xm" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.180654 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-972xm" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.328249 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cwhp5" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.328277 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cwhp5" event={"ID":"471729e9-1d55-4a19-9fc7-2a5313410c46","Type":"ContainerDied","Data":"6d9e58781eed8eb42ddf4147c018b83634ecfa9cd21dcf51ef37f5098cf2b423"} Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.328771 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9e58781eed8eb42ddf4147c018b83634ecfa9cd21dcf51ef37f5098cf2b423" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.333548 4941 generic.go:334] "Generic (PLEG): container finished" podID="04152996-2000-4188-840c-1759d193c903" containerID="8fc805bc99c0c8af89d3b1cb58369ff1e706429e415c177a6ab724b2d108401f" exitCode=0 Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.333671 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m9v9g" event={"ID":"04152996-2000-4188-840c-1759d193c903","Type":"ContainerDied","Data":"8fc805bc99c0c8af89d3b1cb58369ff1e706429e415c177a6ab724b2d108401f"} Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.336226 4941 generic.go:334] "Generic (PLEG): container finished" podID="03901d36-4348-43da-ad11-7592d9dd31e6" containerID="81b976eb55b844a9710c1ad3933aa492f5bc903c7f6a05eb196b62de0c30c0c5" exitCode=0 Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.336306 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547792-rq6d7" 
event={"ID":"03901d36-4348-43da-ad11-7592d9dd31e6","Type":"ContainerDied","Data":"81b976eb55b844a9710c1ad3933aa492f5bc903c7f6a05eb196b62de0c30c0c5"} Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.338616 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-040c-account-create-update-6phkj" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.338632 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-040c-account-create-update-6phkj" event={"ID":"f561ebac-b036-4d82-8e7a-2a43b031c0ba","Type":"ContainerDied","Data":"5da59d1c2f1606959551edf6a0cb34f631e4545cd65361fb26be6cad68434de7"} Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.338668 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5da59d1c2f1606959551edf6a0cb34f631e4545cd65361fb26be6cad68434de7" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.589879 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.597079 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"swift-storage-0\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " pod="openstack/swift-storage-0" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.633344 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 07 07:12:03 crc kubenswrapper[4941]: I0307 07:12:03.710862 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-972xm"] Mar 07 07:12:03 crc kubenswrapper[4941]: W0307 07:12:03.715008 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf0adf58_bd20_433a_a80c_0a871ec201b4.slice/crio-5e866d739f9ee1e83b51594afba57ee3085fda28531d16199306ac3a12d46641 WatchSource:0}: Error finding container 5e866d739f9ee1e83b51594afba57ee3085fda28531d16199306ac3a12d46641: Status 404 returned error can't find the container with id 5e866d739f9ee1e83b51594afba57ee3085fda28531d16199306ac3a12d46641 Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.279030 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 07 07:12:04 crc kubenswrapper[4941]: W0307 07:12:04.289548 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a5f223a_7907_42a5_954b_fafc3c4b78da.slice/crio-da1654ea95b5022cff095825806dfbe23192b7b16b6b327904fd76eee3f987d1 WatchSource:0}: Error finding container da1654ea95b5022cff095825806dfbe23192b7b16b6b327904fd76eee3f987d1: Status 404 returned error can't find the container with id da1654ea95b5022cff095825806dfbe23192b7b16b6b327904fd76eee3f987d1 Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.346025 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"da1654ea95b5022cff095825806dfbe23192b7b16b6b327904fd76eee3f987d1"} Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.347980 4941 generic.go:334] "Generic (PLEG): container finished" podID="bf0adf58-bd20-433a-a80c-0a871ec201b4" 
containerID="f29e9223a03236b2dc5bc79ce9ec2bcfcd1509d865de72b65c0643b5a816cb06" exitCode=0 Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.348473 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-972xm" event={"ID":"bf0adf58-bd20-433a-a80c-0a871ec201b4","Type":"ContainerDied","Data":"f29e9223a03236b2dc5bc79ce9ec2bcfcd1509d865de72b65c0643b5a816cb06"} Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.348496 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-972xm" event={"ID":"bf0adf58-bd20-433a-a80c-0a871ec201b4","Type":"ContainerStarted","Data":"5e866d739f9ee1e83b51594afba57ee3085fda28531d16199306ac3a12d46641"} Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.633309 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547792-rq6d7" Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.710109 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmhnt\" (UniqueName: \"kubernetes.io/projected/03901d36-4348-43da-ad11-7592d9dd31e6-kube-api-access-nmhnt\") pod \"03901d36-4348-43da-ad11-7592d9dd31e6\" (UID: \"03901d36-4348-43da-ad11-7592d9dd31e6\") " Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.714898 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03901d36-4348-43da-ad11-7592d9dd31e6-kube-api-access-nmhnt" (OuterVolumeSpecName: "kube-api-access-nmhnt") pod "03901d36-4348-43da-ad11-7592d9dd31e6" (UID: "03901d36-4348-43da-ad11-7592d9dd31e6"). InnerVolumeSpecName "kube-api-access-nmhnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.774688 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m9v9g" Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.813005 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtbqt\" (UniqueName: \"kubernetes.io/projected/04152996-2000-4188-840c-1759d193c903-kube-api-access-dtbqt\") pod \"04152996-2000-4188-840c-1759d193c903\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.813068 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-combined-ca-bundle\") pod \"04152996-2000-4188-840c-1759d193c903\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.813137 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-dispersionconf\") pod \"04152996-2000-4188-840c-1759d193c903\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.813163 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-swiftconf\") pod \"04152996-2000-4188-840c-1759d193c903\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.813228 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-ring-data-devices\") pod \"04152996-2000-4188-840c-1759d193c903\" (UID: \"04152996-2000-4188-840c-1759d193c903\") " Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.813257 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/04152996-2000-4188-840c-1759d193c903-etc-swift\") pod \"04152996-2000-4188-840c-1759d193c903\" (UID: \"04152996-2000-4188-840c-1759d193c903\") "
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.813325 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-scripts\") pod \"04152996-2000-4188-840c-1759d193c903\" (UID: \"04152996-2000-4188-840c-1759d193c903\") "
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.813630 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmhnt\" (UniqueName: \"kubernetes.io/projected/03901d36-4348-43da-ad11-7592d9dd31e6-kube-api-access-nmhnt\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.814267 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "04152996-2000-4188-840c-1759d193c903" (UID: "04152996-2000-4188-840c-1759d193c903"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.814904 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04152996-2000-4188-840c-1759d193c903-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "04152996-2000-4188-840c-1759d193c903" (UID: "04152996-2000-4188-840c-1759d193c903"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.819685 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04152996-2000-4188-840c-1759d193c903-kube-api-access-dtbqt" (OuterVolumeSpecName: "kube-api-access-dtbqt") pod "04152996-2000-4188-840c-1759d193c903" (UID: "04152996-2000-4188-840c-1759d193c903"). InnerVolumeSpecName "kube-api-access-dtbqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.821056 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "04152996-2000-4188-840c-1759d193c903" (UID: "04152996-2000-4188-840c-1759d193c903"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.833110 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-scripts" (OuterVolumeSpecName: "scripts") pod "04152996-2000-4188-840c-1759d193c903" (UID: "04152996-2000-4188-840c-1759d193c903"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.847931 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04152996-2000-4188-840c-1759d193c903" (UID: "04152996-2000-4188-840c-1759d193c903"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.853378 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "04152996-2000-4188-840c-1759d193c903" (UID: "04152996-2000-4188-840c-1759d193c903"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.915888 4941 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.915938 4941 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.915961 4941 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/04152996-2000-4188-840c-1759d193c903-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.915980 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04152996-2000-4188-840c-1759d193c903-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.915999 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtbqt\" (UniqueName: \"kubernetes.io/projected/04152996-2000-4188-840c-1759d193c903-kube-api-access-dtbqt\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.916017 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:04 crc kubenswrapper[4941]: I0307 07:12:04.916037 4941 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/04152996-2000-4188-840c-1759d193c903-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.071272 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tz4v7"]
Mar 07 07:12:05 crc kubenswrapper[4941]: E0307 07:12:05.072067 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03901d36-4348-43da-ad11-7592d9dd31e6" containerName="oc"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.072094 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="03901d36-4348-43da-ad11-7592d9dd31e6" containerName="oc"
Mar 07 07:12:05 crc kubenswrapper[4941]: E0307 07:12:05.072116 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04152996-2000-4188-840c-1759d193c903" containerName="swift-ring-rebalance"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.072125 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="04152996-2000-4188-840c-1759d193c903" containerName="swift-ring-rebalance"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.072352 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="03901d36-4348-43da-ad11-7592d9dd31e6" containerName="oc"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.072429 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="04152996-2000-4188-840c-1759d193c903" containerName="swift-ring-rebalance"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.073087 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.078107 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7jf2v"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.078127 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.083266 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tz4v7"]
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.118706 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-db-sync-config-data\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.118900 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2w8w\" (UniqueName: \"kubernetes.io/projected/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-kube-api-access-h2w8w\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.118948 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-combined-ca-bundle\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.118985 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-config-data\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.220687 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-config-data\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.220816 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-db-sync-config-data\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.221594 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2w8w\" (UniqueName: \"kubernetes.io/projected/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-kube-api-access-h2w8w\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.222026 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-combined-ca-bundle\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.227962 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-config-data\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.228811 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-db-sync-config-data\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.229595 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-combined-ca-bundle\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.244816 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2w8w\" (UniqueName: \"kubernetes.io/projected/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-kube-api-access-h2w8w\") pod \"glance-db-sync-tz4v7\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.363793 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m9v9g" event={"ID":"04152996-2000-4188-840c-1759d193c903","Type":"ContainerDied","Data":"be4b09a954a1b9dbac4461b1545b18cf1d5d6f14d6793fdedc05fae9c67bfd36"}
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.363860 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be4b09a954a1b9dbac4461b1545b18cf1d5d6f14d6793fdedc05fae9c67bfd36"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.363824 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m9v9g"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.365741 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547792-rq6d7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.365781 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547792-rq6d7" event={"ID":"03901d36-4348-43da-ad11-7592d9dd31e6","Type":"ContainerDied","Data":"636c21093b812c453bacba3ccdcff470dd583dfe174b805c7c984027c3b57276"}
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.365829 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="636c21093b812c453bacba3ccdcff470dd583dfe174b805c7c984027c3b57276"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.402749 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tz4v7"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.690781 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547786-dwbsp"]
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.721372 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547786-dwbsp"]
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.740366 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x7fq9" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" containerName="ovn-controller" probeResult="failure" output=<
Mar 07 07:12:05 crc kubenswrapper[4941]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 07 07:12:05 crc kubenswrapper[4941]: >
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.741173 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.776146 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-972xm"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.841587 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qhlc\" (UniqueName: \"kubernetes.io/projected/bf0adf58-bd20-433a-a80c-0a871ec201b4-kube-api-access-9qhlc\") pod \"bf0adf58-bd20-433a-a80c-0a871ec201b4\" (UID: \"bf0adf58-bd20-433a-a80c-0a871ec201b4\") "
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.841638 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0adf58-bd20-433a-a80c-0a871ec201b4-operator-scripts\") pod \"bf0adf58-bd20-433a-a80c-0a871ec201b4\" (UID: \"bf0adf58-bd20-433a-a80c-0a871ec201b4\") "
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.842849 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf0adf58-bd20-433a-a80c-0a871ec201b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf0adf58-bd20-433a-a80c-0a871ec201b4" (UID: "bf0adf58-bd20-433a-a80c-0a871ec201b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.849810 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0adf58-bd20-433a-a80c-0a871ec201b4-kube-api-access-9qhlc" (OuterVolumeSpecName: "kube-api-access-9qhlc") pod "bf0adf58-bd20-433a-a80c-0a871ec201b4" (UID: "bf0adf58-bd20-433a-a80c-0a871ec201b4"). InnerVolumeSpecName "kube-api-access-9qhlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.943822 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qhlc\" (UniqueName: \"kubernetes.io/projected/bf0adf58-bd20-433a-a80c-0a871ec201b4-kube-api-access-9qhlc\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.943853 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0adf58-bd20-433a-a80c-0a871ec201b4-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.962839 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46ddec52-6876-4e01-975f-0c00387eba75" path="/var/lib/kubelet/pods/46ddec52-6876-4e01-975f-0c00387eba75/volumes"
Mar 07 07:12:05 crc kubenswrapper[4941]: I0307 07:12:05.963636 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tz4v7"]
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.376118 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tz4v7" event={"ID":"be73b1fb-9f01-4e2b-a4fa-7f004be742e3","Type":"ContainerStarted","Data":"6f3d3f0bb30de897df6b3beee6e1b4634700642f66d1e8aaed137e84860c9cf3"}
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.383073 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-972xm" event={"ID":"bf0adf58-bd20-433a-a80c-0a871ec201b4","Type":"ContainerDied","Data":"5e866d739f9ee1e83b51594afba57ee3085fda28531d16199306ac3a12d46641"}
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.383128 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e866d739f9ee1e83b51594afba57ee3085fda28531d16199306ac3a12d46641"
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.383132 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-972xm"
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.388058 4941 generic.go:334] "Generic (PLEG): container finished" podID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerID="c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86" exitCode=0
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.388128 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3963d293-d9e9-44b6-b0a5-b1532b4a0a31","Type":"ContainerDied","Data":"c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86"}
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.392483 4941 generic.go:334] "Generic (PLEG): container finished" podID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" containerID="a12662163e378ef0047a7d0c3ffc76b2214269655c2741ec69ff5a13c078ddf4" exitCode=0
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.392571 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670","Type":"ContainerDied","Data":"a12662163e378ef0047a7d0c3ffc76b2214269655c2741ec69ff5a13c078ddf4"}
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.399925 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65"}
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.399982 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57"}
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.400004 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c"}
Mar 07 07:12:06 crc kubenswrapper[4941]: I0307 07:12:06.400022 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89"}
Mar 07 07:12:07 crc kubenswrapper[4941]: I0307 07:12:07.411070 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3963d293-d9e9-44b6-b0a5-b1532b4a0a31","Type":"ContainerStarted","Data":"fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55"}
Mar 07 07:12:07 crc kubenswrapper[4941]: I0307 07:12:07.411535 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 07 07:12:07 crc kubenswrapper[4941]: I0307 07:12:07.414854 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670","Type":"ContainerStarted","Data":"f7a8e765543e88a1c6e7d28463ec6d1148163252cc8cc4989b9a46a6cdfd7693"}
Mar 07 07:12:07 crc kubenswrapper[4941]: I0307 07:12:07.415031 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 07 07:12:07 crc kubenswrapper[4941]: I0307 07:12:07.452899 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.553394912 podStartE2EDuration="57.452879721s" podCreationTimestamp="2026-03-07 07:11:10 +0000 UTC" firstStartedPulling="2026-03-07 07:11:22.804239713 +0000 UTC m=+1179.756605178" lastFinishedPulling="2026-03-07 07:11:31.703724522 +0000 UTC m=+1188.656089987" observedRunningTime="2026-03-07 07:12:07.431922402 +0000 UTC m=+1224.384287867" watchObservedRunningTime="2026-03-07 07:12:07.452879721 +0000 UTC m=+1224.405245186"
Mar 07 07:12:07 crc kubenswrapper[4941]: I0307 07:12:07.459524 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.005732967 podStartE2EDuration="58.459503273s" podCreationTimestamp="2026-03-07 07:11:09 +0000 UTC" firstStartedPulling="2026-03-07 07:11:22.976825891 +0000 UTC m=+1179.929191356" lastFinishedPulling="2026-03-07 07:11:31.430596197 +0000 UTC m=+1188.382961662" observedRunningTime="2026-03-07 07:12:07.454959942 +0000 UTC m=+1224.407325407" watchObservedRunningTime="2026-03-07 07:12:07.459503273 +0000 UTC m=+1224.411868738"
Mar 07 07:12:08 crc kubenswrapper[4941]: I0307 07:12:08.426011 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8"}
Mar 07 07:12:08 crc kubenswrapper[4941]: I0307 07:12:08.426867 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c"}
Mar 07 07:12:08 crc kubenswrapper[4941]: I0307 07:12:08.426887 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef"}
Mar 07 07:12:09 crc kubenswrapper[4941]: I0307 07:12:09.435475 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010"}
Mar 07 07:12:10 crc kubenswrapper[4941]: I0307 07:12:10.449249 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384"}
Mar 07 07:12:10 crc kubenswrapper[4941]: I0307 07:12:10.717493 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x7fq9" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" containerName="ovn-controller" probeResult="failure" output=<
Mar 07 07:12:10 crc kubenswrapper[4941]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 07 07:12:10 crc kubenswrapper[4941]: >
Mar 07 07:12:11 crc kubenswrapper[4941]: I0307 07:12:11.461925 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7"}
Mar 07 07:12:15 crc kubenswrapper[4941]: I0307 07:12:15.717757 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x7fq9" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" containerName="ovn-controller" probeResult="failure" output=<
Mar 07 07:12:15 crc kubenswrapper[4941]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 07 07:12:15 crc kubenswrapper[4941]: >
Mar 07 07:12:15 crc kubenswrapper[4941]: I0307 07:12:15.804218 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vrr7t"
Mar 07 07:12:15 crc kubenswrapper[4941]: I0307 07:12:15.810752 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vrr7t"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.129066 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x7fq9-config-tcgfk"]
Mar 07 07:12:16 crc kubenswrapper[4941]: E0307 07:12:16.129531 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0adf58-bd20-433a-a80c-0a871ec201b4" containerName="mariadb-account-create-update"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.129557 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0adf58-bd20-433a-a80c-0a871ec201b4" containerName="mariadb-account-create-update"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.129808 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf0adf58-bd20-433a-a80c-0a871ec201b4" containerName="mariadb-account-create-update"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.130488 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.133976 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.138143 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7fq9-config-tcgfk"]
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.240327 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-additional-scripts\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.240373 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.240447 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run-ovn\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.240468 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-scripts\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.240491 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-log-ovn\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.240516 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk2fs\" (UniqueName: \"kubernetes.io/projected/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-kube-api-access-lk2fs\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342151 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run-ovn\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342198 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-scripts\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342225 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-log-ovn\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342254 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk2fs\" (UniqueName: \"kubernetes.io/projected/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-kube-api-access-lk2fs\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342324 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-additional-scripts\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342345 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342574 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-log-ovn\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342600 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run-ovn\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.342590 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.343198 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-additional-scripts\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.344530 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-scripts\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.368916 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk2fs\" (UniqueName: \"kubernetes.io/projected/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-kube-api-access-lk2fs\") pod \"ovn-controller-x7fq9-config-tcgfk\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:16 crc kubenswrapper[4941]: I0307 07:12:16.454681 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7fq9-config-tcgfk"
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.166237 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7fq9-config-tcgfk"]
Mar 07 07:12:18 crc kubenswrapper[4941]: W0307 07:12:18.175584 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38a25c41_f0a7_480b_8fb1_c2920f0ad1e9.slice/crio-65c8e292bbacccff5725617dad04b809e679f9482477ba52dce55a65aa1d2f63 WatchSource:0}: Error finding container 65c8e292bbacccff5725617dad04b809e679f9482477ba52dce55a65aa1d2f63: Status 404 returned error can't find the container with id 65c8e292bbacccff5725617dad04b809e679f9482477ba52dce55a65aa1d2f63
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.522254 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9-config-tcgfk" event={"ID":"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9","Type":"ContainerStarted","Data":"b92f5de33b659f93ae7984e5c8bec08d59a4376d29eec81d883e98db00bb0c26"}
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.522710 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9-config-tcgfk" event={"ID":"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9","Type":"ContainerStarted","Data":"65c8e292bbacccff5725617dad04b809e679f9482477ba52dce55a65aa1d2f63"}
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.532776 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6"}
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.532814 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52"}
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.532824 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf"}
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.532833 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11"}
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.534011 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tz4v7" event={"ID":"be73b1fb-9f01-4e2b-a4fa-7f004be742e3","Type":"ContainerStarted","Data":"869bc218596392f73c6fe9f035895dd8eea729b54c44ab37db7ecb0acfe66eae"}
Mar 07 07:12:18 crc kubenswrapper[4941]: I0307 07:12:18.549249 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-x7fq9-config-tcgfk" podStartSLOduration=2.549229437 podStartE2EDuration="2.549229437s" podCreationTimestamp="2026-03-07 07:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:18.537602271 +0000 UTC m=+1235.489967736" watchObservedRunningTime="2026-03-07 07:12:18.549229437 +0000 UTC m=+1235.501594902"
Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.545600 4941 generic.go:334] "Generic (PLEG): container finished" podID="38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" containerID="b92f5de33b659f93ae7984e5c8bec08d59a4376d29eec81d883e98db00bb0c26" exitCode=0
Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.545736 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9-config-tcgfk" event={"ID":"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9","Type":"ContainerDied","Data":"b92f5de33b659f93ae7984e5c8bec08d59a4376d29eec81d883e98db00bb0c26"}
Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.561222 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerStarted","Data":"de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed"}
Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.571834 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tz4v7" podStartSLOduration=2.786760233 podStartE2EDuration="14.57181215s" podCreationTimestamp="2026-03-07 07:12:05 +0000 UTC" firstStartedPulling="2026-03-07 07:12:05.965674243 +0000 UTC m=+1222.918039708" lastFinishedPulling="2026-03-07 07:12:17.75072616 +0000 UTC m=+1234.703091625" observedRunningTime="2026-03-07 07:12:18.556784138 +0000 UTC m=+1235.509149603" watchObservedRunningTime="2026-03-07 07:12:19.57181215 +0000 UTC m=+1236.524177615"
Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.619228 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=28.193105378 podStartE2EDuration="33.619205253s" podCreationTimestamp="2026-03-07 07:11:46 +0000 UTC" firstStartedPulling="2026-03-07 07:12:04.291778514 +0000 UTC m=+1221.244143979" lastFinishedPulling="2026-03-07 07:12:09.717878389 +0000 UTC m=+1226.670243854" observedRunningTime="2026-03-07 07:12:19.612823701 +0000 UTC m=+1236.565189206"
watchObservedRunningTime="2026-03-07 07:12:19.619205253 +0000 UTC m=+1236.571570728" Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.874511 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-xvqwk"] Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.876826 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.879855 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.897032 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-xvqwk"] Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.902618 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h22bl\" (UniqueName: \"kubernetes.io/projected/69eaf595-4875-440a-8b7f-b9dd8787c325-kube-api-access-h22bl\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.902678 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.902724 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-config\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:19 
crc kubenswrapper[4941]: I0307 07:12:19.902761 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.902816 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-svc\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:19 crc kubenswrapper[4941]: I0307 07:12:19.902852 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-swift-storage-0\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.004122 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-swift-storage-0\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.004259 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h22bl\" (UniqueName: \"kubernetes.io/projected/69eaf595-4875-440a-8b7f-b9dd8787c325-kube-api-access-h22bl\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " 
pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.004283 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.004312 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-config\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.004336 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.004372 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-svc\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.005086 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-swift-storage-0\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc 
kubenswrapper[4941]: I0307 07:12:20.005094 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-svc\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.005652 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.006223 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-config\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.006828 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.022193 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h22bl\" (UniqueName: \"kubernetes.io/projected/69eaf595-4875-440a-8b7f-b9dd8787c325-kube-api-access-h22bl\") pod \"dnsmasq-dns-6d74f8fb89-xvqwk\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.195445 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.640998 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-xvqwk"] Mar 07 07:12:20 crc kubenswrapper[4941]: W0307 07:12:20.643793 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69eaf595_4875_440a_8b7f_b9dd8787c325.slice/crio-3e8d6e45416e833d7f6755e9a565b39306ce55eaf003c837f502ac88d388423f WatchSource:0}: Error finding container 3e8d6e45416e833d7f6755e9a565b39306ce55eaf003c837f502ac88d388423f: Status 404 returned error can't find the container with id 3e8d6e45416e833d7f6755e9a565b39306ce55eaf003c837f502ac88d388423f Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.744743 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-x7fq9" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.826950 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x7fq9-config-tcgfk" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920008 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-scripts\") pod \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920094 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-additional-scripts\") pod \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920143 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-log-ovn\") pod \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920221 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run-ovn\") pod \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920261 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk2fs\" (UniqueName: \"kubernetes.io/projected/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-kube-api-access-lk2fs\") pod \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920275 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" (UID: "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920454 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run\") pod \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\" (UID: \"38a25c41-f0a7-480b-8fb1-c2920f0ad1e9\") " Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920489 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run" (OuterVolumeSpecName: "var-run") pod "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" (UID: "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.920924 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" (UID: "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.921008 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" (UID: "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.921055 4941 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.921081 4941 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.921494 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-scripts" (OuterVolumeSpecName: "scripts") pod "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" (UID: "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:20 crc kubenswrapper[4941]: I0307 07:12:20.923805 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-kube-api-access-lk2fs" (OuterVolumeSpecName: "kube-api-access-lk2fs") pod "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" (UID: "38a25c41-f0a7-480b-8fb1-c2920f0ad1e9"). InnerVolumeSpecName "kube-api-access-lk2fs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.024845 4941 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.024893 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk2fs\" (UniqueName: \"kubernetes.io/projected/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-kube-api-access-lk2fs\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.024917 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.024935 4941 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.243214 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x7fq9-config-tcgfk"] Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.250964 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-x7fq9-config-tcgfk"] Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.336377 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x7fq9-config-7v7wl"] Mar 07 07:12:21 crc kubenswrapper[4941]: E0307 07:12:21.336747 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" containerName="ovn-config" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.336768 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" 
containerName="ovn-config" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.336943 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" containerName="ovn-config" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.337630 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.350064 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7fq9-config-7v7wl"] Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.368328 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.431480 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcs2d\" (UniqueName: \"kubernetes.io/projected/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-kube-api-access-rcs2d\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.431535 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-log-ovn\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.431601 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run-ovn\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " 
pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.431635 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.431698 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-additional-scripts\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.431727 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-scripts\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533147 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcs2d\" (UniqueName: \"kubernetes.io/projected/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-kube-api-access-rcs2d\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533202 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-log-ovn\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: 
\"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533267 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run-ovn\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533294 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533354 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-additional-scripts\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533373 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-scripts\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533563 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-log-ovn\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " 
pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533625 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.533930 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run-ovn\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.534692 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-additional-scripts\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.535897 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-scripts\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.557504 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcs2d\" (UniqueName: \"kubernetes.io/projected/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-kube-api-access-rcs2d\") pod \"ovn-controller-x7fq9-config-7v7wl\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 
07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.587025 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65c8e292bbacccff5725617dad04b809e679f9482477ba52dce55a65aa1d2f63" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.587035 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7fq9-config-tcgfk" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.588826 4941 generic.go:334] "Generic (PLEG): container finished" podID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerID="2e20a5e1f6ab7a40805e0646f13c4e25de495dbd25d24f8b5ca7b57c07fb052a" exitCode=0 Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.588871 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" event={"ID":"69eaf595-4875-440a-8b7f-b9dd8787c325","Type":"ContainerDied","Data":"2e20a5e1f6ab7a40805e0646f13c4e25de495dbd25d24f8b5ca7b57c07fb052a"} Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.588901 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" event={"ID":"69eaf595-4875-440a-8b7f-b9dd8787c325","Type":"ContainerStarted","Data":"3e8d6e45416e833d7f6755e9a565b39306ce55eaf003c837f502ac88d388423f"} Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.657383 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.707614 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 07 07:12:21 crc kubenswrapper[4941]: I0307 07:12:21.965289 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a25c41-f0a7-480b-8fb1-c2920f0ad1e9" path="/var/lib/kubelet/pods/38a25c41-f0a7-480b-8fb1-c2920f0ad1e9/volumes" Mar 07 07:12:22 crc kubenswrapper[4941]: I0307 07:12:22.121016 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x7fq9-config-7v7wl"] Mar 07 07:12:22 crc kubenswrapper[4941]: W0307 07:12:22.126769 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bdb1f62_f7ff_4d83_9428_7fbad7d54ab3.slice/crio-8d9c313ae118b60d2fdf122a929ed38646f1b16fc373ddb207e6fa0bccf80b62 WatchSource:0}: Error finding container 8d9c313ae118b60d2fdf122a929ed38646f1b16fc373ddb207e6fa0bccf80b62: Status 404 returned error can't find the container with id 8d9c313ae118b60d2fdf122a929ed38646f1b16fc373ddb207e6fa0bccf80b62 Mar 07 07:12:22 crc kubenswrapper[4941]: I0307 07:12:22.597155 4941 generic.go:334] "Generic (PLEG): container finished" podID="0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" containerID="771a6acb79466846fbbda26c235f8a9856350aa97ba28e19f837ce95c717bb48" exitCode=0 Mar 07 07:12:22 crc kubenswrapper[4941]: I0307 07:12:22.597313 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9-config-7v7wl" event={"ID":"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3","Type":"ContainerDied","Data":"771a6acb79466846fbbda26c235f8a9856350aa97ba28e19f837ce95c717bb48"} Mar 07 07:12:22 crc kubenswrapper[4941]: I0307 07:12:22.597503 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9-config-7v7wl" 
event={"ID":"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3","Type":"ContainerStarted","Data":"8d9c313ae118b60d2fdf122a929ed38646f1b16fc373ddb207e6fa0bccf80b62"} Mar 07 07:12:22 crc kubenswrapper[4941]: I0307 07:12:22.599425 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" event={"ID":"69eaf595-4875-440a-8b7f-b9dd8787c325","Type":"ContainerStarted","Data":"f860858c174a197ff23e9f8272e738e7612917a538c17b2314121259cb063e02"} Mar 07 07:12:22 crc kubenswrapper[4941]: I0307 07:12:22.599582 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:22 crc kubenswrapper[4941]: I0307 07:12:22.634056 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" podStartSLOduration=3.634038162 podStartE2EDuration="3.634038162s" podCreationTimestamp="2026-03-07 07:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:22.632748569 +0000 UTC m=+1239.585114034" watchObservedRunningTime="2026-03-07 07:12:22.634038162 +0000 UTC m=+1239.586403627" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.304508 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4bnlh"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.306193 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.311702 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4bnlh"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.416047 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1815-account-create-update-hdrpv"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.417426 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.419802 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.431678 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1815-account-create-update-hdrpv"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.464306 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a4dd8e-f8bf-4695-8883-da720a6e1efd-operator-scripts\") pod \"cinder-db-create-4bnlh\" (UID: \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\") " pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.464480 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qrzp\" (UniqueName: \"kubernetes.io/projected/27a4dd8e-f8bf-4695-8883-da720a6e1efd-kube-api-access-6qrzp\") pod \"cinder-db-create-4bnlh\" (UID: \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\") " pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.503797 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-mhtvx"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.504734 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.516433 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b246-account-create-update-c975h"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.517540 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.519476 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.530486 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-mhtvx"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.540357 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b246-account-create-update-c975h"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.573607 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxnnc\" (UniqueName: \"kubernetes.io/projected/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-kube-api-access-jxnnc\") pod \"cinder-1815-account-create-update-hdrpv\" (UID: \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\") " pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.573882 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-operator-scripts\") pod \"cinder-1815-account-create-update-hdrpv\" (UID: \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\") " pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.574020 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qrzp\" (UniqueName: \"kubernetes.io/projected/27a4dd8e-f8bf-4695-8883-da720a6e1efd-kube-api-access-6qrzp\") pod \"cinder-db-create-4bnlh\" (UID: \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\") " pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.574157 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a4dd8e-f8bf-4695-8883-da720a6e1efd-operator-scripts\") pod \"cinder-db-create-4bnlh\" (UID: \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\") " pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.574999 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a4dd8e-f8bf-4695-8883-da720a6e1efd-operator-scripts\") pod \"cinder-db-create-4bnlh\" (UID: \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\") " pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.606038 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qrzp\" (UniqueName: \"kubernetes.io/projected/27a4dd8e-f8bf-4695-8883-da720a6e1efd-kube-api-access-6qrzp\") pod \"cinder-db-create-4bnlh\" (UID: \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\") " pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.610792 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-lbdr9"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.612854 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.617208 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lf9xv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.617371 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.617472 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.619663 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.620353 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.623834 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lbdr9"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.636419 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rgx2z"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.643334 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rgx2z"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.643639 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.675900 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxnnc\" (UniqueName: \"kubernetes.io/projected/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-kube-api-access-jxnnc\") pod \"cinder-1815-account-create-update-hdrpv\" (UID: \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\") " pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.675953 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-operator-scripts\") pod \"cinder-1815-account-create-update-hdrpv\" (UID: \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\") " pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.675997 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wql69\" (UniqueName: \"kubernetes.io/projected/25419741-acb3-497c-b0cf-c2bf78d58bd1-kube-api-access-wql69\") pod \"barbican-db-create-mhtvx\" (UID: \"25419741-acb3-497c-b0cf-c2bf78d58bd1\") " pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.676041 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfwbq\" (UniqueName: \"kubernetes.io/projected/9da0c545-5faf-43e4-afbb-f016c457a9e0-kube-api-access-cfwbq\") pod \"barbican-b246-account-create-update-c975h\" (UID: \"9da0c545-5faf-43e4-afbb-f016c457a9e0\") " pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.676062 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9da0c545-5faf-43e4-afbb-f016c457a9e0-operator-scripts\") pod \"barbican-b246-account-create-update-c975h\" (UID: \"9da0c545-5faf-43e4-afbb-f016c457a9e0\") " pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.676095 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25419741-acb3-497c-b0cf-c2bf78d58bd1-operator-scripts\") pod \"barbican-db-create-mhtvx\" (UID: \"25419741-acb3-497c-b0cf-c2bf78d58bd1\") " pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.677228 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-operator-scripts\") pod \"cinder-1815-account-create-update-hdrpv\" (UID: \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\") " pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.704727 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxnnc\" (UniqueName: \"kubernetes.io/projected/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-kube-api-access-jxnnc\") pod \"cinder-1815-account-create-update-hdrpv\" (UID: \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\") " pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.732252 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.739360 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1a10-account-create-update-mk8sb"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.742209 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.744380 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.760730 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1a10-account-create-update-mk8sb"] Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.779994 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwbq\" (UniqueName: \"kubernetes.io/projected/9da0c545-5faf-43e4-afbb-f016c457a9e0-kube-api-access-cfwbq\") pod \"barbican-b246-account-create-update-c975h\" (UID: \"9da0c545-5faf-43e4-afbb-f016c457a9e0\") " pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780045 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da0c545-5faf-43e4-afbb-f016c457a9e0-operator-scripts\") pod \"barbican-b246-account-create-update-c975h\" (UID: \"9da0c545-5faf-43e4-afbb-f016c457a9e0\") " pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780101 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25419741-acb3-497c-b0cf-c2bf78d58bd1-operator-scripts\") pod \"barbican-db-create-mhtvx\" (UID: \"25419741-acb3-497c-b0cf-c2bf78d58bd1\") " pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780177 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csztr\" (UniqueName: \"kubernetes.io/projected/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-kube-api-access-csztr\") pod \"neutron-db-create-rgx2z\" (UID: 
\"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\") " pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780224 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znv47\" (UniqueName: \"kubernetes.io/projected/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-kube-api-access-znv47\") pod \"neutron-1a10-account-create-update-mk8sb\" (UID: \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\") " pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780256 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8n7\" (UniqueName: \"kubernetes.io/projected/8819def4-42df-4a8f-b5d0-21db1e1ca87a-kube-api-access-gh8n7\") pod \"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780319 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-operator-scripts\") pod \"neutron-1a10-account-create-update-mk8sb\" (UID: \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\") " pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780341 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-config-data\") pod \"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780448 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-combined-ca-bundle\") pod \"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780489 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wql69\" (UniqueName: \"kubernetes.io/projected/25419741-acb3-497c-b0cf-c2bf78d58bd1-kube-api-access-wql69\") pod \"barbican-db-create-mhtvx\" (UID: \"25419741-acb3-497c-b0cf-c2bf78d58bd1\") " pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.780517 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-operator-scripts\") pod \"neutron-db-create-rgx2z\" (UID: \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\") " pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.781471 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da0c545-5faf-43e4-afbb-f016c457a9e0-operator-scripts\") pod \"barbican-b246-account-create-update-c975h\" (UID: \"9da0c545-5faf-43e4-afbb-f016c457a9e0\") " pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.782148 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25419741-acb3-497c-b0cf-c2bf78d58bd1-operator-scripts\") pod \"barbican-db-create-mhtvx\" (UID: \"25419741-acb3-497c-b0cf-c2bf78d58bd1\") " pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.806909 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wql69\" (UniqueName: 
\"kubernetes.io/projected/25419741-acb3-497c-b0cf-c2bf78d58bd1-kube-api-access-wql69\") pod \"barbican-db-create-mhtvx\" (UID: \"25419741-acb3-497c-b0cf-c2bf78d58bd1\") " pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.811208 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfwbq\" (UniqueName: \"kubernetes.io/projected/9da0c545-5faf-43e4-afbb-f016c457a9e0-kube-api-access-cfwbq\") pod \"barbican-b246-account-create-update-c975h\" (UID: \"9da0c545-5faf-43e4-afbb-f016c457a9e0\") " pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.818859 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.831237 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.882487 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-combined-ca-bundle\") pod \"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.882746 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-operator-scripts\") pod \"neutron-db-create-rgx2z\" (UID: \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\") " pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.882836 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csztr\" (UniqueName: 
\"kubernetes.io/projected/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-kube-api-access-csztr\") pod \"neutron-db-create-rgx2z\" (UID: \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\") " pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.882861 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znv47\" (UniqueName: \"kubernetes.io/projected/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-kube-api-access-znv47\") pod \"neutron-1a10-account-create-update-mk8sb\" (UID: \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\") " pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.882884 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8n7\" (UniqueName: \"kubernetes.io/projected/8819def4-42df-4a8f-b5d0-21db1e1ca87a-kube-api-access-gh8n7\") pod \"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.882917 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-operator-scripts\") pod \"neutron-1a10-account-create-update-mk8sb\" (UID: \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\") " pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.882935 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-config-data\") pod \"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.885303 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-operator-scripts\") pod \"neutron-1a10-account-create-update-mk8sb\" (UID: \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\") " pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.888008 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-operator-scripts\") pod \"neutron-db-create-rgx2z\" (UID: \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\") " pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.890219 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-combined-ca-bundle\") pod \"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.895857 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-config-data\") pod \"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.918920 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csztr\" (UniqueName: \"kubernetes.io/projected/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-kube-api-access-csztr\") pod \"neutron-db-create-rgx2z\" (UID: \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\") " pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.923264 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8n7\" (UniqueName: \"kubernetes.io/projected/8819def4-42df-4a8f-b5d0-21db1e1ca87a-kube-api-access-gh8n7\") pod 
\"keystone-db-sync-lbdr9\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:23 crc kubenswrapper[4941]: I0307 07:12:23.923774 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znv47\" (UniqueName: \"kubernetes.io/projected/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-kube-api-access-znv47\") pod \"neutron-1a10-account-create-update-mk8sb\" (UID: \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\") " pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.003120 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.139278 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.173101 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.183930 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.187792 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run-ovn\") pod \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.187854 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcs2d\" (UniqueName: \"kubernetes.io/projected/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-kube-api-access-rcs2d\") pod \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.187949 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-additional-scripts\") pod \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.188014 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-scripts\") pod \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.188076 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run\") pod \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.188095 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-log-ovn\") pod \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\" (UID: \"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3\") " Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.188285 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" (UID: "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.188381 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" (UID: "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.188517 4941 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.188528 4941 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.188855 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run" (OuterVolumeSpecName: "var-run") pod "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" (UID: "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.189084 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" (UID: "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.189587 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-scripts" (OuterVolumeSpecName: "scripts") pod "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" (UID: "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.192795 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-kube-api-access-rcs2d" (OuterVolumeSpecName: "kube-api-access-rcs2d") pod "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" (UID: "0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3"). InnerVolumeSpecName "kube-api-access-rcs2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.247760 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4bnlh"] Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.299917 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcs2d\" (UniqueName: \"kubernetes.io/projected/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-kube-api-access-rcs2d\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.299953 4941 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.299963 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.299974 4941 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3-var-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.375962 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1815-account-create-update-hdrpv"] Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.470685 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b246-account-create-update-c975h"] Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.480502 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-mhtvx"] Mar 07 07:12:24 crc kubenswrapper[4941]: W0307 07:12:24.485688 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9da0c545_5faf_43e4_afbb_f016c457a9e0.slice/crio-2388a507cf94f24318a55c31cd282a95d6f52584448dd061ba441d49a968c708 WatchSource:0}: Error finding container 2388a507cf94f24318a55c31cd282a95d6f52584448dd061ba441d49a968c708: Status 404 returned error can't find the container with id 2388a507cf94f24318a55c31cd282a95d6f52584448dd061ba441d49a968c708 Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.643991 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9-config-7v7wl" event={"ID":"0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3","Type":"ContainerDied","Data":"8d9c313ae118b60d2fdf122a929ed38646f1b16fc373ddb207e6fa0bccf80b62"} Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.644033 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d9c313ae118b60d2fdf122a929ed38646f1b16fc373ddb207e6fa0bccf80b62" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.644010 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x7fq9-config-7v7wl" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.645835 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mhtvx" event={"ID":"25419741-acb3-497c-b0cf-c2bf78d58bd1","Type":"ContainerStarted","Data":"29138e79c853da65bd39aef90dfdb322c7371465f46012c2d5f7f7393c9bca1e"} Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.648700 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b246-account-create-update-c975h" event={"ID":"9da0c545-5faf-43e4-afbb-f016c457a9e0","Type":"ContainerStarted","Data":"2388a507cf94f24318a55c31cd282a95d6f52584448dd061ba441d49a968c708"} Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.650187 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4bnlh" event={"ID":"27a4dd8e-f8bf-4695-8883-da720a6e1efd","Type":"ContainerStarted","Data":"db932bd212b934d5d0602e6a0bbf134bb38c6be7b68eb9e1378b335a513b1db6"} Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.650227 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4bnlh" event={"ID":"27a4dd8e-f8bf-4695-8883-da720a6e1efd","Type":"ContainerStarted","Data":"b0e306f0430f278fffadd34110bd43b7809f000012eb4cd31958a378962c985d"} Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.656172 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1815-account-create-update-hdrpv" event={"ID":"dc655c4d-3dd7-40c6-85c9-d53daedf8a65","Type":"ContainerStarted","Data":"c7b5da34db23d169d9edebd23dd84191643d56cd40b78fafb5704b26e70e540c"} Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.656233 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1815-account-create-update-hdrpv" event={"ID":"dc655c4d-3dd7-40c6-85c9-d53daedf8a65","Type":"ContainerStarted","Data":"c1d47bb0cf520182595206d828f73b1a50a57cae6398dde4f3342f4cea6bf0d9"} Mar 07 07:12:24 
crc kubenswrapper[4941]: I0307 07:12:24.681917 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-4bnlh" podStartSLOduration=1.681894869 podStartE2EDuration="1.681894869s" podCreationTimestamp="2026-03-07 07:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:24.662872216 +0000 UTC m=+1241.615237681" watchObservedRunningTime="2026-03-07 07:12:24.681894869 +0000 UTC m=+1241.634260334" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.697875 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.707260 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lbdr9"] Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.714732 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-1815-account-create-update-hdrpv" podStartSLOduration=1.714682851 podStartE2EDuration="1.714682851s" podCreationTimestamp="2026-03-07 07:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:24.684767042 +0000 UTC m=+1241.637132507" watchObservedRunningTime="2026-03-07 07:12:24.714682851 +0000 UTC m=+1241.667048316" Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.764656 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1a10-account-create-update-mk8sb"] Mar 07 07:12:24 crc kubenswrapper[4941]: I0307 07:12:24.787946 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rgx2z"] Mar 07 07:12:24 crc kubenswrapper[4941]: W0307 07:12:24.808733 4941 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a1b17cf_5cc0_4c89_8757_7cc78a79a94f.slice/crio-9a0519b9a5822194247e1e78e18354e8ff3915730de6954f9693e2632c8a9830 WatchSource:0}: Error finding container 9a0519b9a5822194247e1e78e18354e8ff3915730de6954f9693e2632c8a9830: Status 404 returned error can't find the container with id 9a0519b9a5822194247e1e78e18354e8ff3915730de6954f9693e2632c8a9830 Mar 07 07:12:24 crc kubenswrapper[4941]: W0307 07:12:24.809277 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode96bc9b1_d1eb_4f10_a9df_9bcbf43d897f.slice/crio-cc281062d9df3cc9b7df49697eb82995146cfd1d8e5d7a470ec26ec59d1e5504 WatchSource:0}: Error finding container cc281062d9df3cc9b7df49697eb82995146cfd1d8e5d7a470ec26ec59d1e5504: Status 404 returned error can't find the container with id cc281062d9df3cc9b7df49697eb82995146cfd1d8e5d7a470ec26ec59d1e5504 Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.110343 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x7fq9-config-7v7wl"] Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.122716 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-x7fq9-config-7v7wl"] Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.667330 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lbdr9" event={"ID":"8819def4-42df-4a8f-b5d0-21db1e1ca87a","Type":"ContainerStarted","Data":"0e8e044df5054e44d861059cc63f8078f9ce8f91ae87f91d028f0b23a98d0910"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.671274 4941 generic.go:334] "Generic (PLEG): container finished" podID="dc655c4d-3dd7-40c6-85c9-d53daedf8a65" containerID="c7b5da34db23d169d9edebd23dd84191643d56cd40b78fafb5704b26e70e540c" exitCode=0 Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.671348 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-1815-account-create-update-hdrpv" event={"ID":"dc655c4d-3dd7-40c6-85c9-d53daedf8a65","Type":"ContainerDied","Data":"c7b5da34db23d169d9edebd23dd84191643d56cd40b78fafb5704b26e70e540c"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.675683 4941 generic.go:334] "Generic (PLEG): container finished" podID="2a1b17cf-5cc0-4c89-8757-7cc78a79a94f" containerID="240c2c14f09122e45dabd994d557fd833ae18604a37a74ced8f0772544c52251" exitCode=0 Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.675811 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a10-account-create-update-mk8sb" event={"ID":"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f","Type":"ContainerDied","Data":"240c2c14f09122e45dabd994d557fd833ae18604a37a74ced8f0772544c52251"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.675833 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a10-account-create-update-mk8sb" event={"ID":"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f","Type":"ContainerStarted","Data":"9a0519b9a5822194247e1e78e18354e8ff3915730de6954f9693e2632c8a9830"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.678689 4941 generic.go:334] "Generic (PLEG): container finished" podID="25419741-acb3-497c-b0cf-c2bf78d58bd1" containerID="c1f4de81631f63d5d8bbd5141740771337f23ab59c7dbb591d707c033658d5ad" exitCode=0 Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.678774 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mhtvx" event={"ID":"25419741-acb3-497c-b0cf-c2bf78d58bd1","Type":"ContainerDied","Data":"c1f4de81631f63d5d8bbd5141740771337f23ab59c7dbb591d707c033658d5ad"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.682362 4941 generic.go:334] "Generic (PLEG): container finished" podID="e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f" containerID="242786fcb2ee6c81ed2471133631002d89ae5e5600992100b73f86fed708be4c" exitCode=0 Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.682481 4941 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rgx2z" event={"ID":"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f","Type":"ContainerDied","Data":"242786fcb2ee6c81ed2471133631002d89ae5e5600992100b73f86fed708be4c"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.682500 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rgx2z" event={"ID":"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f","Type":"ContainerStarted","Data":"cc281062d9df3cc9b7df49697eb82995146cfd1d8e5d7a470ec26ec59d1e5504"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.690797 4941 generic.go:334] "Generic (PLEG): container finished" podID="9da0c545-5faf-43e4-afbb-f016c457a9e0" containerID="b204fbe9ce1cfc32ef2454ccbe3f384c2a0a936bc042d09ea6997fe637820ed5" exitCode=0 Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.690874 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b246-account-create-update-c975h" event={"ID":"9da0c545-5faf-43e4-afbb-f016c457a9e0","Type":"ContainerDied","Data":"b204fbe9ce1cfc32ef2454ccbe3f384c2a0a936bc042d09ea6997fe637820ed5"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.694187 4941 generic.go:334] "Generic (PLEG): container finished" podID="27a4dd8e-f8bf-4695-8883-da720a6e1efd" containerID="db932bd212b934d5d0602e6a0bbf134bb38c6be7b68eb9e1378b335a513b1db6" exitCode=0 Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.694228 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4bnlh" event={"ID":"27a4dd8e-f8bf-4695-8883-da720a6e1efd","Type":"ContainerDied","Data":"db932bd212b934d5d0602e6a0bbf134bb38c6be7b68eb9e1378b335a513b1db6"} Mar 07 07:12:25 crc kubenswrapper[4941]: I0307 07:12:25.965287 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" path="/var/lib/kubelet/pods/0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3/volumes" Mar 07 07:12:26 crc kubenswrapper[4941]: I0307 
07:12:26.702812 4941 generic.go:334] "Generic (PLEG): container finished" podID="be73b1fb-9f01-4e2b-a4fa-7f004be742e3" containerID="869bc218596392f73c6fe9f035895dd8eea729b54c44ab37db7ecb0acfe66eae" exitCode=0 Mar 07 07:12:26 crc kubenswrapper[4941]: I0307 07:12:26.702980 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tz4v7" event={"ID":"be73b1fb-9f01-4e2b-a4fa-7f004be742e3","Type":"ContainerDied","Data":"869bc218596392f73c6fe9f035895dd8eea729b54c44ab37db7ecb0acfe66eae"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.063820 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.071287 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.092666 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.108391 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.110942 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.114088 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.145079 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tz4v7" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.218828 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da0c545-5faf-43e4-afbb-f016c457a9e0-operator-scripts\") pod \"9da0c545-5faf-43e4-afbb-f016c457a9e0\" (UID: \"9da0c545-5faf-43e4-afbb-f016c457a9e0\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.218903 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wql69\" (UniqueName: \"kubernetes.io/projected/25419741-acb3-497c-b0cf-c2bf78d58bd1-kube-api-access-wql69\") pod \"25419741-acb3-497c-b0cf-c2bf78d58bd1\" (UID: \"25419741-acb3-497c-b0cf-c2bf78d58bd1\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.218944 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfwbq\" (UniqueName: \"kubernetes.io/projected/9da0c545-5faf-43e4-afbb-f016c457a9e0-kube-api-access-cfwbq\") pod \"9da0c545-5faf-43e4-afbb-f016c457a9e0\" (UID: \"9da0c545-5faf-43e4-afbb-f016c457a9e0\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.218969 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a4dd8e-f8bf-4695-8883-da720a6e1efd-operator-scripts\") pod \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\" (UID: \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.219013 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-operator-scripts\") pod \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\" (UID: \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.219029 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-operator-scripts\") pod \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\" (UID: \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.219052 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znv47\" (UniqueName: \"kubernetes.io/projected/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-kube-api-access-znv47\") pod \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\" (UID: \"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.219076 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25419741-acb3-497c-b0cf-c2bf78d58bd1-operator-scripts\") pod \"25419741-acb3-497c-b0cf-c2bf78d58bd1\" (UID: \"25419741-acb3-497c-b0cf-c2bf78d58bd1\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.219093 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-operator-scripts\") pod \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\" (UID: \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.219128 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csztr\" (UniqueName: \"kubernetes.io/projected/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-kube-api-access-csztr\") pod \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\" (UID: \"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.219148 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qrzp\" (UniqueName: \"kubernetes.io/projected/27a4dd8e-f8bf-4695-8883-da720a6e1efd-kube-api-access-6qrzp\") pod 
\"27a4dd8e-f8bf-4695-8883-da720a6e1efd\" (UID: \"27a4dd8e-f8bf-4695-8883-da720a6e1efd\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.219173 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxnnc\" (UniqueName: \"kubernetes.io/projected/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-kube-api-access-jxnnc\") pod \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\" (UID: \"dc655c4d-3dd7-40c6-85c9-d53daedf8a65\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.220174 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc655c4d-3dd7-40c6-85c9-d53daedf8a65" (UID: "dc655c4d-3dd7-40c6-85c9-d53daedf8a65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.220921 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da0c545-5faf-43e4-afbb-f016c457a9e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9da0c545-5faf-43e4-afbb-f016c457a9e0" (UID: "9da0c545-5faf-43e4-afbb-f016c457a9e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.221143 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25419741-acb3-497c-b0cf-c2bf78d58bd1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25419741-acb3-497c-b0cf-c2bf78d58bd1" (UID: "25419741-acb3-497c-b0cf-c2bf78d58bd1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.221220 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a4dd8e-f8bf-4695-8883-da720a6e1efd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27a4dd8e-f8bf-4695-8883-da720a6e1efd" (UID: "27a4dd8e-f8bf-4695-8883-da720a6e1efd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.221285 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a1b17cf-5cc0-4c89-8757-7cc78a79a94f" (UID: "2a1b17cf-5cc0-4c89-8757-7cc78a79a94f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.224202 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da0c545-5faf-43e4-afbb-f016c457a9e0-kube-api-access-cfwbq" (OuterVolumeSpecName: "kube-api-access-cfwbq") pod "9da0c545-5faf-43e4-afbb-f016c457a9e0" (UID: "9da0c545-5faf-43e4-afbb-f016c457a9e0"). InnerVolumeSpecName "kube-api-access-cfwbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.224958 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a4dd8e-f8bf-4695-8883-da720a6e1efd-kube-api-access-6qrzp" (OuterVolumeSpecName: "kube-api-access-6qrzp") pod "27a4dd8e-f8bf-4695-8883-da720a6e1efd" (UID: "27a4dd8e-f8bf-4695-8883-da720a6e1efd"). InnerVolumeSpecName "kube-api-access-6qrzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.225496 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25419741-acb3-497c-b0cf-c2bf78d58bd1-kube-api-access-wql69" (OuterVolumeSpecName: "kube-api-access-wql69") pod "25419741-acb3-497c-b0cf-c2bf78d58bd1" (UID: "25419741-acb3-497c-b0cf-c2bf78d58bd1"). InnerVolumeSpecName "kube-api-access-wql69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.226146 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-kube-api-access-csztr" (OuterVolumeSpecName: "kube-api-access-csztr") pod "e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f" (UID: "e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f"). InnerVolumeSpecName "kube-api-access-csztr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.226504 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-kube-api-access-jxnnc" (OuterVolumeSpecName: "kube-api-access-jxnnc") pod "dc655c4d-3dd7-40c6-85c9-d53daedf8a65" (UID: "dc655c4d-3dd7-40c6-85c9-d53daedf8a65"). InnerVolumeSpecName "kube-api-access-jxnnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.227798 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f" (UID: "e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.239058 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-kube-api-access-znv47" (OuterVolumeSpecName: "kube-api-access-znv47") pod "2a1b17cf-5cc0-4c89-8757-7cc78a79a94f" (UID: "2a1b17cf-5cc0-4c89-8757-7cc78a79a94f"). InnerVolumeSpecName "kube-api-access-znv47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.320608 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-config-data\") pod \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.320746 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-db-sync-config-data\") pod \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.320769 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2w8w\" (UniqueName: \"kubernetes.io/projected/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-kube-api-access-h2w8w\") pod \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.320825 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-combined-ca-bundle\") pod \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\" (UID: \"be73b1fb-9f01-4e2b-a4fa-7f004be742e3\") " Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 
07:12:29.321236 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321257 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321266 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znv47\" (UniqueName: \"kubernetes.io/projected/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f-kube-api-access-znv47\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321277 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25419741-acb3-497c-b0cf-c2bf78d58bd1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321286 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321295 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csztr\" (UniqueName: \"kubernetes.io/projected/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f-kube-api-access-csztr\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321304 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qrzp\" (UniqueName: \"kubernetes.io/projected/27a4dd8e-f8bf-4695-8883-da720a6e1efd-kube-api-access-6qrzp\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321312 4941 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-jxnnc\" (UniqueName: \"kubernetes.io/projected/dc655c4d-3dd7-40c6-85c9-d53daedf8a65-kube-api-access-jxnnc\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321329 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da0c545-5faf-43e4-afbb-f016c457a9e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321338 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wql69\" (UniqueName: \"kubernetes.io/projected/25419741-acb3-497c-b0cf-c2bf78d58bd1-kube-api-access-wql69\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321346 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfwbq\" (UniqueName: \"kubernetes.io/projected/9da0c545-5faf-43e4-afbb-f016c457a9e0-kube-api-access-cfwbq\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.321354 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a4dd8e-f8bf-4695-8883-da720a6e1efd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.324045 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "be73b1fb-9f01-4e2b-a4fa-7f004be742e3" (UID: "be73b1fb-9f01-4e2b-a4fa-7f004be742e3"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.324733 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-kube-api-access-h2w8w" (OuterVolumeSpecName: "kube-api-access-h2w8w") pod "be73b1fb-9f01-4e2b-a4fa-7f004be742e3" (UID: "be73b1fb-9f01-4e2b-a4fa-7f004be742e3"). InnerVolumeSpecName "kube-api-access-h2w8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.343351 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be73b1fb-9f01-4e2b-a4fa-7f004be742e3" (UID: "be73b1fb-9f01-4e2b-a4fa-7f004be742e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.362441 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-config-data" (OuterVolumeSpecName: "config-data") pod "be73b1fb-9f01-4e2b-a4fa-7f004be742e3" (UID: "be73b1fb-9f01-4e2b-a4fa-7f004be742e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.422736 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.422780 4941 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.422802 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2w8w\" (UniqueName: \"kubernetes.io/projected/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-kube-api-access-h2w8w\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.422814 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be73b1fb-9f01-4e2b-a4fa-7f004be742e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.737972 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tz4v7" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.737972 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tz4v7" event={"ID":"be73b1fb-9f01-4e2b-a4fa-7f004be742e3","Type":"ContainerDied","Data":"6f3d3f0bb30de897df6b3beee6e1b4634700642f66d1e8aaed137e84860c9cf3"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.738539 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3d3f0bb30de897df6b3beee6e1b4634700642f66d1e8aaed137e84860c9cf3" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.739751 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a10-account-create-update-mk8sb" event={"ID":"2a1b17cf-5cc0-4c89-8757-7cc78a79a94f","Type":"ContainerDied","Data":"9a0519b9a5822194247e1e78e18354e8ff3915730de6954f9693e2632c8a9830"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.739972 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a10-account-create-update-mk8sb" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.740054 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a0519b9a5822194247e1e78e18354e8ff3915730de6954f9693e2632c8a9830" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.741269 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mhtvx" event={"ID":"25419741-acb3-497c-b0cf-c2bf78d58bd1","Type":"ContainerDied","Data":"29138e79c853da65bd39aef90dfdb322c7371465f46012c2d5f7f7393c9bca1e"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.741290 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29138e79c853da65bd39aef90dfdb322c7371465f46012c2d5f7f7393c9bca1e" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.741355 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-mhtvx" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.743542 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rgx2z" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.743524 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rgx2z" event={"ID":"e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f","Type":"ContainerDied","Data":"cc281062d9df3cc9b7df49697eb82995146cfd1d8e5d7a470ec26ec59d1e5504"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.743770 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc281062d9df3cc9b7df49697eb82995146cfd1d8e5d7a470ec26ec59d1e5504" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.744917 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b246-account-create-update-c975h" event={"ID":"9da0c545-5faf-43e4-afbb-f016c457a9e0","Type":"ContainerDied","Data":"2388a507cf94f24318a55c31cd282a95d6f52584448dd061ba441d49a968c708"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.744931 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b246-account-create-update-c975h" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.744938 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2388a507cf94f24318a55c31cd282a95d6f52584448dd061ba441d49a968c708" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.746152 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4bnlh" event={"ID":"27a4dd8e-f8bf-4695-8883-da720a6e1efd","Type":"ContainerDied","Data":"b0e306f0430f278fffadd34110bd43b7809f000012eb4cd31958a378962c985d"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.746175 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0e306f0430f278fffadd34110bd43b7809f000012eb4cd31958a378962c985d" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.746222 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4bnlh" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.748502 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lbdr9" event={"ID":"8819def4-42df-4a8f-b5d0-21db1e1ca87a","Type":"ContainerStarted","Data":"73cfdb16aec3ceb4e0b85e78966b407c79934267edb526cc91fff8ba1abf81d4"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.750739 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1815-account-create-update-hdrpv" event={"ID":"dc655c4d-3dd7-40c6-85c9-d53daedf8a65","Type":"ContainerDied","Data":"c1d47bb0cf520182595206d828f73b1a50a57cae6398dde4f3342f4cea6bf0d9"} Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.750779 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1d47bb0cf520182595206d828f73b1a50a57cae6398dde4f3342f4cea6bf0d9" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.750754 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1815-account-create-update-hdrpv" Mar 07 07:12:29 crc kubenswrapper[4941]: I0307 07:12:29.774587 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-lbdr9" podStartSLOduration=2.5698076949999997 podStartE2EDuration="6.774564575s" podCreationTimestamp="2026-03-07 07:12:23 +0000 UTC" firstStartedPulling="2026-03-07 07:12:24.697525786 +0000 UTC m=+1241.649891241" lastFinishedPulling="2026-03-07 07:12:28.902282656 +0000 UTC m=+1245.854648121" observedRunningTime="2026-03-07 07:12:29.762585771 +0000 UTC m=+1246.714951246" watchObservedRunningTime="2026-03-07 07:12:29.774564575 +0000 UTC m=+1246.726930050" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.196711 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.258035 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-htc9x"] Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.258301 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" podUID="1458c12c-70fd-4cc9-b886-88f99711104f" containerName="dnsmasq-dns" containerID="cri-o://76c58b9a930b18c64f775408126e8cec528882fdb608a784350559dcc0d3e016" gracePeriod=10 Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.587507 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dfff6465-xg5bw"] Mar 07 07:12:30 crc kubenswrapper[4941]: E0307 07:12:30.588205 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1b17cf-5cc0-4c89-8757-7cc78a79a94f" containerName="mariadb-account-create-update" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.588221 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1b17cf-5cc0-4c89-8757-7cc78a79a94f" containerName="mariadb-account-create-update" 
Mar 07 07:12:30 crc kubenswrapper[4941]: E0307 07:12:30.588238 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a4dd8e-f8bf-4695-8883-da720a6e1efd" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.588246 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a4dd8e-f8bf-4695-8883-da720a6e1efd" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: E0307 07:12:30.588256 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.588263 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: E0307 07:12:30.588276 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be73b1fb-9f01-4e2b-a4fa-7f004be742e3" containerName="glance-db-sync" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.588283 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="be73b1fb-9f01-4e2b-a4fa-7f004be742e3" containerName="glance-db-sync" Mar 07 07:12:30 crc kubenswrapper[4941]: E0307 07:12:30.588294 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da0c545-5faf-43e4-afbb-f016c457a9e0" containerName="mariadb-account-create-update" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.588301 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da0c545-5faf-43e4-afbb-f016c457a9e0" containerName="mariadb-account-create-update" Mar 07 07:12:30 crc kubenswrapper[4941]: E0307 07:12:30.588316 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" containerName="ovn-config" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.588323 4941 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" containerName="ovn-config" Mar 07 07:12:30 crc kubenswrapper[4941]: E0307 07:12:30.588335 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25419741-acb3-497c-b0cf-c2bf78d58bd1" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.588343 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="25419741-acb3-497c-b0cf-c2bf78d58bd1" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: E0307 07:12:30.588354 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc655c4d-3dd7-40c6-85c9-d53daedf8a65" containerName="mariadb-account-create-update" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.588361 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc655c4d-3dd7-40c6-85c9-d53daedf8a65" containerName="mariadb-account-create-update" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.621963 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="25419741-acb3-497c-b0cf-c2bf78d58bd1" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.622045 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="be73b1fb-9f01-4e2b-a4fa-7f004be742e3" containerName="glance-db-sync" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.622088 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da0c545-5faf-43e4-afbb-f016c457a9e0" containerName="mariadb-account-create-update" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.622115 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1b17cf-5cc0-4c89-8757-7cc78a79a94f" containerName="mariadb-account-create-update" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.622135 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc655c4d-3dd7-40c6-85c9-d53daedf8a65" containerName="mariadb-account-create-update" Mar 07 07:12:30 crc 
kubenswrapper[4941]: I0307 07:12:30.622155 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.622179 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a4dd8e-f8bf-4695-8883-da720a6e1efd" containerName="mariadb-database-create" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.622314 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bdb1f62-f7ff-4d83-9428-7fbad7d54ab3" containerName="ovn-config" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.624310 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.641073 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dfff6465-xg5bw"] Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.752646 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-swift-storage-0\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.752697 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s594r\" (UniqueName: \"kubernetes.io/projected/b762f743-61e9-4b20-8812-85ab6edb8e04-kube-api-access-s594r\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.752725 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-svc\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.752771 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-config\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.752810 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-nb\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.752867 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-sb\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.767663 4941 generic.go:334] "Generic (PLEG): container finished" podID="1458c12c-70fd-4cc9-b886-88f99711104f" containerID="76c58b9a930b18c64f775408126e8cec528882fdb608a784350559dcc0d3e016" exitCode=0 Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.768244 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" event={"ID":"1458c12c-70fd-4cc9-b886-88f99711104f","Type":"ContainerDied","Data":"76c58b9a930b18c64f775408126e8cec528882fdb608a784350559dcc0d3e016"} Mar 07 07:12:30 crc 
kubenswrapper[4941]: I0307 07:12:30.857808 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-nb\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.857890 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-sb\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.857951 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-swift-storage-0\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.857975 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s594r\" (UniqueName: \"kubernetes.io/projected/b762f743-61e9-4b20-8812-85ab6edb8e04-kube-api-access-s594r\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.858008 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-svc\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.858047 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-config\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.858896 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-config\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.859192 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-swift-storage-0\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.861904 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-sb\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.862900 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-svc\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.863676 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-nb\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.881818 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.892270 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s594r\" (UniqueName: \"kubernetes.io/projected/b762f743-61e9-4b20-8812-85ab6edb8e04-kube-api-access-s594r\") pod \"dnsmasq-dns-dfff6465-xg5bw\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.958792 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-nb\") pod \"1458c12c-70fd-4cc9-b886-88f99711104f\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.958861 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-config\") pod \"1458c12c-70fd-4cc9-b886-88f99711104f\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.958896 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q74qd\" (UniqueName: \"kubernetes.io/projected/1458c12c-70fd-4cc9-b886-88f99711104f-kube-api-access-q74qd\") pod \"1458c12c-70fd-4cc9-b886-88f99711104f\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.958975 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-sb\") pod \"1458c12c-70fd-4cc9-b886-88f99711104f\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.959003 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-dns-svc\") pod \"1458c12c-70fd-4cc9-b886-88f99711104f\" (UID: \"1458c12c-70fd-4cc9-b886-88f99711104f\") " Mar 07 07:12:30 crc kubenswrapper[4941]: I0307 07:12:30.979222 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.136739 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1458c12c-70fd-4cc9-b886-88f99711104f-kube-api-access-q74qd" (OuterVolumeSpecName: "kube-api-access-q74qd") pod "1458c12c-70fd-4cc9-b886-88f99711104f" (UID: "1458c12c-70fd-4cc9-b886-88f99711104f"). InnerVolumeSpecName "kube-api-access-q74qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.138283 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q74qd\" (UniqueName: \"kubernetes.io/projected/1458c12c-70fd-4cc9-b886-88f99711104f-kube-api-access-q74qd\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.170437 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-config" (OuterVolumeSpecName: "config") pod "1458c12c-70fd-4cc9-b886-88f99711104f" (UID: "1458c12c-70fd-4cc9-b886-88f99711104f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.179251 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1458c12c-70fd-4cc9-b886-88f99711104f" (UID: "1458c12c-70fd-4cc9-b886-88f99711104f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.191909 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1458c12c-70fd-4cc9-b886-88f99711104f" (UID: "1458c12c-70fd-4cc9-b886-88f99711104f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.205378 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1458c12c-70fd-4cc9-b886-88f99711104f" (UID: "1458c12c-70fd-4cc9-b886-88f99711104f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.239495 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.239527 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.239538 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.239546 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1458c12c-70fd-4cc9-b886-88f99711104f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.649650 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dfff6465-xg5bw"] Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.776960 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" event={"ID":"b762f743-61e9-4b20-8812-85ab6edb8e04","Type":"ContainerStarted","Data":"3106989038ec2fb62798e60aeebdfc422d81d2f1d178e72dc7829ba3d38e88c6"} Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.778668 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" event={"ID":"1458c12c-70fd-4cc9-b886-88f99711104f","Type":"ContainerDied","Data":"91ae9e6ac2d97c6d36a4671564bc8e0275de1f13f9005b91481b06bcd9d63cb2"} Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.778725 4941 scope.go:117] "RemoveContainer" 
containerID="76c58b9a930b18c64f775408126e8cec528882fdb608a784350559dcc0d3e016" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.778886 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-htc9x" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.809790 4941 scope.go:117] "RemoveContainer" containerID="9eb14d74e3a81aeea91e4a8d7aee8449a7e5e24a4dec781e91b80b616894f3ea" Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.816973 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-htc9x"] Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.824536 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-htc9x"] Mar 07 07:12:31 crc kubenswrapper[4941]: I0307 07:12:31.965571 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1458c12c-70fd-4cc9-b886-88f99711104f" path="/var/lib/kubelet/pods/1458c12c-70fd-4cc9-b886-88f99711104f/volumes" Mar 07 07:12:32 crc kubenswrapper[4941]: I0307 07:12:32.791369 4941 generic.go:334] "Generic (PLEG): container finished" podID="b762f743-61e9-4b20-8812-85ab6edb8e04" containerID="040dc80080e7d7b4c1466a949d5283e136322dde0c80a7f5e3acdec21688d192" exitCode=0 Mar 07 07:12:32 crc kubenswrapper[4941]: I0307 07:12:32.791521 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" event={"ID":"b762f743-61e9-4b20-8812-85ab6edb8e04","Type":"ContainerDied","Data":"040dc80080e7d7b4c1466a949d5283e136322dde0c80a7f5e3acdec21688d192"} Mar 07 07:12:33 crc kubenswrapper[4941]: I0307 07:12:33.804533 4941 generic.go:334] "Generic (PLEG): container finished" podID="8819def4-42df-4a8f-b5d0-21db1e1ca87a" containerID="73cfdb16aec3ceb4e0b85e78966b407c79934267edb526cc91fff8ba1abf81d4" exitCode=0 Mar 07 07:12:33 crc kubenswrapper[4941]: I0307 07:12:33.804658 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-lbdr9" event={"ID":"8819def4-42df-4a8f-b5d0-21db1e1ca87a","Type":"ContainerDied","Data":"73cfdb16aec3ceb4e0b85e78966b407c79934267edb526cc91fff8ba1abf81d4"} Mar 07 07:12:33 crc kubenswrapper[4941]: I0307 07:12:33.807170 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" event={"ID":"b762f743-61e9-4b20-8812-85ab6edb8e04","Type":"ContainerStarted","Data":"8ce4203dcdd436e35ed42ed66da874262a1aba403719aff3b186ae4baec89435"} Mar 07 07:12:33 crc kubenswrapper[4941]: I0307 07:12:33.807333 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:33 crc kubenswrapper[4941]: I0307 07:12:33.853554 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" podStartSLOduration=3.853522593 podStartE2EDuration="3.853522593s" podCreationTimestamp="2026-03-07 07:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:33.848468864 +0000 UTC m=+1250.800834369" watchObservedRunningTime="2026-03-07 07:12:33.853522593 +0000 UTC m=+1250.805888068" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.219764 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.310655 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh8n7\" (UniqueName: \"kubernetes.io/projected/8819def4-42df-4a8f-b5d0-21db1e1ca87a-kube-api-access-gh8n7\") pod \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.311062 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-config-data\") pod \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.311106 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-combined-ca-bundle\") pod \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\" (UID: \"8819def4-42df-4a8f-b5d0-21db1e1ca87a\") " Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.315936 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8819def4-42df-4a8f-b5d0-21db1e1ca87a-kube-api-access-gh8n7" (OuterVolumeSpecName: "kube-api-access-gh8n7") pod "8819def4-42df-4a8f-b5d0-21db1e1ca87a" (UID: "8819def4-42df-4a8f-b5d0-21db1e1ca87a"). InnerVolumeSpecName "kube-api-access-gh8n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.338043 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8819def4-42df-4a8f-b5d0-21db1e1ca87a" (UID: "8819def4-42df-4a8f-b5d0-21db1e1ca87a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.362384 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-config-data" (OuterVolumeSpecName: "config-data") pod "8819def4-42df-4a8f-b5d0-21db1e1ca87a" (UID: "8819def4-42df-4a8f-b5d0-21db1e1ca87a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.413644 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh8n7\" (UniqueName: \"kubernetes.io/projected/8819def4-42df-4a8f-b5d0-21db1e1ca87a-kube-api-access-gh8n7\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.413673 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.413682 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8819def4-42df-4a8f-b5d0-21db1e1ca87a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.823113 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lbdr9" event={"ID":"8819def4-42df-4a8f-b5d0-21db1e1ca87a","Type":"ContainerDied","Data":"0e8e044df5054e44d861059cc63f8078f9ce8f91ae87f91d028f0b23a98d0910"} Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.823355 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lbdr9" Mar 07 07:12:35 crc kubenswrapper[4941]: I0307 07:12:35.823551 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8e044df5054e44d861059cc63f8078f9ce8f91ae87f91d028f0b23a98d0910" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.074549 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dfff6465-xg5bw"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.074786 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" podUID="b762f743-61e9-4b20-8812-85ab6edb8e04" containerName="dnsmasq-dns" containerID="cri-o://8ce4203dcdd436e35ed42ed66da874262a1aba403719aff3b186ae4baec89435" gracePeriod=10 Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.116320 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jqlt7"] Mar 07 07:12:36 crc kubenswrapper[4941]: E0307 07:12:36.116930 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458c12c-70fd-4cc9-b886-88f99711104f" containerName="dnsmasq-dns" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.116946 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458c12c-70fd-4cc9-b886-88f99711104f" containerName="dnsmasq-dns" Mar 07 07:12:36 crc kubenswrapper[4941]: E0307 07:12:36.116967 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8819def4-42df-4a8f-b5d0-21db1e1ca87a" containerName="keystone-db-sync" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.116974 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8819def4-42df-4a8f-b5d0-21db1e1ca87a" containerName="keystone-db-sync" Mar 07 07:12:36 crc kubenswrapper[4941]: E0307 07:12:36.116994 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458c12c-70fd-4cc9-b886-88f99711104f" containerName="init" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 
07:12:36.117002 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458c12c-70fd-4cc9-b886-88f99711104f" containerName="init" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.117173 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458c12c-70fd-4cc9-b886-88f99711104f" containerName="dnsmasq-dns" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.117190 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="8819def4-42df-4a8f-b5d0-21db1e1ca87a" containerName="keystone-db-sync" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.117840 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.119819 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.122248 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.122388 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.122451 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.122603 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lf9xv" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.137817 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67c555b79c-cqhpf"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.139179 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.148577 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jqlt7"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.184184 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67c555b79c-cqhpf"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.224304 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb9ks\" (UniqueName: \"kubernetes.io/projected/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-kube-api-access-sb9ks\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.224339 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-credential-keys\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.224391 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-fernet-keys\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.224438 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-scripts\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 
crc kubenswrapper[4941]: I0307 07:12:36.224464 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-config-data\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.224690 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-combined-ca-bundle\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326179 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-scripts\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326234 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-config-data\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326259 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-config\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326304 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-combined-ca-bundle\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326345 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-swift-storage-0\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326364 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzsct\" (UniqueName: \"kubernetes.io/projected/bb809502-d1a2-42e2-b60d-0ac873fdd73b-kube-api-access-kzsct\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326383 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-nb\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326414 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9ks\" (UniqueName: \"kubernetes.io/projected/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-kube-api-access-sb9ks\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc 
kubenswrapper[4941]: I0307 07:12:36.326432 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-svc\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326452 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-credential-keys\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326486 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-sb\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.326511 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-fernet-keys\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.329848 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-k7f6s"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.330811 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.331005 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-fernet-keys\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.336205 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-combined-ca-bundle\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.339842 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-66txm" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.340024 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.341136 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-scripts\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.346123 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-config-data\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.351144 4941 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-config-data" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.373139 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-credential-keys\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.375435 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9ks\" (UniqueName: \"kubernetes.io/projected/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-kube-api-access-sb9ks\") pod \"keystone-bootstrap-jqlt7\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.379615 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jswzb"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.394706 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.399761 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m482l" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.399979 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.400161 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.437474 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-config\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.439686 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-config\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.440207 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-swift-storage-0\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.437558 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-swift-storage-0\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" 
(UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.440969 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzsct\" (UniqueName: \"kubernetes.io/projected/bb809502-d1a2-42e2-b60d-0ac873fdd73b-kube-api-access-kzsct\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.441004 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-nb\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.441025 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-svc\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.441082 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-sb\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.441693 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-sb\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " 
pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.442528 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-nb\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.443135 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-svc\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.450483 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-k7f6s"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.495565 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.510136 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzsct\" (UniqueName: \"kubernetes.io/projected/bb809502-d1a2-42e2-b60d-0ac873fdd73b-kube-api-access-kzsct\") pod \"dnsmasq-dns-67c555b79c-cqhpf\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.525484 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.527508 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.535788 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.536211 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.536377 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.542883 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl6rp\" (UniqueName: \"kubernetes.io/projected/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-kube-api-access-jl6rp\") pod \"neutron-db-sync-jswzb\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.542926 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-scripts\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.542952 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-config-data\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.542975 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756qc\" (UniqueName: \"kubernetes.io/projected/84812b46-cde1-4da2-9a4d-e0e6013c56fe-kube-api-access-756qc\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 
07:12:36.542995 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-log-httpd\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.543011 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpjls\" (UniqueName: \"kubernetes.io/projected/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-kube-api-access-lpjls\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.543027 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.543045 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-combined-ca-bundle\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.543064 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-scripts\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.543087 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-config-data\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.543111 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-db-sync-config-data\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.547307 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.547454 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-config\") pod \"neutron-db-sync-jswzb\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.547495 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-etc-machine-id\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.547544 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-run-httpd\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.547582 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-combined-ca-bundle\") pod \"neutron-db-sync-jswzb\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.578564 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jswzb"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.587428 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.609541 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-n6fhd"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.610643 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.612728 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jd92q" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.612881 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.618561 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-n6fhd"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.625832 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67c555b79c-cqhpf"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.635218 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-754b99d75-mlbz8"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.636594 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.642737 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-27tlg"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.643691 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.646997 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.648656 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-jh569" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650303 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-combined-ca-bundle\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650355 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-config\") pod \"neutron-db-sync-jswzb\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650378 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr585\" (UniqueName: \"kubernetes.io/projected/a038eb59-eed0-442b-9076-5e5091511b2b-kube-api-access-vr585\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650417 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-etc-machine-id\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc 
kubenswrapper[4941]: I0307 07:12:36.650437 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-scripts\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650462 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-run-httpd\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650479 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-swift-storage-0\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650522 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-etc-machine-id\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650562 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-combined-ca-bundle\") pod \"neutron-db-sync-jswzb\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650601 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntjlt\" (UniqueName: \"kubernetes.io/projected/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-kube-api-access-ntjlt\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650676 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-config-data\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650725 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl6rp\" (UniqueName: \"kubernetes.io/projected/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-kube-api-access-jl6rp\") pod \"neutron-db-sync-jswzb\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650747 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-db-sync-config-data\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650784 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-scripts\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650815 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-combined-ca-bundle\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650834 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-config-data\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.650852 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-run-httpd\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651516 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756qc\" (UniqueName: \"kubernetes.io/projected/84812b46-cde1-4da2-9a4d-e0e6013c56fe-kube-api-access-756qc\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651615 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-svc\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651657 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-nb\") pod 
\"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651695 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-log-httpd\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651718 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn56w\" (UniqueName: \"kubernetes.io/projected/3d89d1d4-04b0-4778-98d7-1cc12db0588b-kube-api-access-nn56w\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651752 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpjls\" (UniqueName: \"kubernetes.io/projected/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-kube-api-access-lpjls\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651779 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651815 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-combined-ca-bundle\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " 
pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651837 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-sb\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651867 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-logs\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651898 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-scripts\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651956 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-config-data\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.651985 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-config\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.652034 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-db-sync-config-data\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.652059 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.655892 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.656707 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-combined-ca-bundle\") pod \"neutron-db-sync-jswzb\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.657533 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-log-httpd\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.657916 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-config\") pod \"neutron-db-sync-jswzb\" 
(UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.658994 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.667050 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-mlbz8"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.667600 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-scripts\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.669955 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-config-data\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.678041 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.682333 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-scripts\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.682838 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-config-data\") pod \"ceilometer-0\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.682961 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-db-sync-config-data\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.683378 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-combined-ca-bundle\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.685274 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpjls\" (UniqueName: \"kubernetes.io/projected/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-kube-api-access-lpjls\") pod \"cinder-db-sync-k7f6s\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.685731 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl6rp\" (UniqueName: \"kubernetes.io/projected/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-kube-api-access-jl6rp\") pod \"neutron-db-sync-jswzb\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.686821 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-756qc\" (UniqueName: \"kubernetes.io/projected/84812b46-cde1-4da2-9a4d-e0e6013c56fe-kube-api-access-756qc\") pod \"ceilometer-0\" (UID: 
\"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " pod="openstack/ceilometer-0" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.690022 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-27tlg"] Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753371 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjlt\" (UniqueName: \"kubernetes.io/projected/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-kube-api-access-ntjlt\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753438 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-config-data\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753461 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-db-sync-config-data\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753490 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-combined-ca-bundle\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753521 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-svc\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753539 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-nb\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753556 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn56w\" (UniqueName: \"kubernetes.io/projected/3d89d1d4-04b0-4778-98d7-1cc12db0588b-kube-api-access-nn56w\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753586 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-sb\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753604 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-logs\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753642 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-config\") pod 
\"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753682 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-combined-ca-bundle\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753705 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr585\" (UniqueName: \"kubernetes.io/projected/a038eb59-eed0-442b-9076-5e5091511b2b-kube-api-access-vr585\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753727 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-scripts\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.753753 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-swift-storage-0\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.754543 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-swift-storage-0\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: 
\"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.756800 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-logs\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.757262 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-sb\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.757262 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-config\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.758578 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-nb\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.759203 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-svc\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.759849 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-db-sync-config-data\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.759931 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-config-data\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.764252 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-scripts\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.764869 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-combined-ca-bundle\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.767093 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-combined-ca-bundle\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.786588 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntjlt\" (UniqueName: 
\"kubernetes.io/projected/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-kube-api-access-ntjlt\") pod \"placement-db-sync-27tlg\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.786708 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn56w\" (UniqueName: \"kubernetes.io/projected/3d89d1d4-04b0-4778-98d7-1cc12db0588b-kube-api-access-nn56w\") pod \"barbican-db-sync-n6fhd\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.788585 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr585\" (UniqueName: \"kubernetes.io/projected/a038eb59-eed0-442b-9076-5e5091511b2b-kube-api-access-vr585\") pod \"dnsmasq-dns-754b99d75-mlbz8\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.847416 4941 generic.go:334] "Generic (PLEG): container finished" podID="b762f743-61e9-4b20-8812-85ab6edb8e04" containerID="8ce4203dcdd436e35ed42ed66da874262a1aba403719aff3b186ae4baec89435" exitCode=0 Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.847467 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" event={"ID":"b762f743-61e9-4b20-8812-85ab6edb8e04","Type":"ContainerDied","Data":"8ce4203dcdd436e35ed42ed66da874262a1aba403719aff3b186ae4baec89435"} Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.916526 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.936963 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jswzb" Mar 07 07:12:36 crc kubenswrapper[4941]: I0307 07:12:36.950649 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.002864 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.024251 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.038876 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-27tlg" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.151940 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jqlt7"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.160245 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67c555b79c-cqhpf"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.219962 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.222854 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.229142 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.229965 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7jf2v" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.230189 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.230384 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.274994 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.322025 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.323880 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.329822 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.334569 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.334806 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.367162 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-logs\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.367268 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.367309 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.367333 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.367362 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.367397 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.367489 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76vjb\" (UniqueName: \"kubernetes.io/projected/c3c39296-5d56-4f48-9a56-574e5d590482-kube-api-access-76vjb\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.367522 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.470132 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-logs\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.470713 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-logs\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471514 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471572 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv92\" (UniqueName: \"kubernetes.io/projected/eb4102ed-3089-4b55-a29a-99c09d7243d5-kube-api-access-dwv92\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471658 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471721 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471743 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471763 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471796 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471813 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471857 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471874 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471962 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.471981 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.472026 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.472105 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76vjb\" (UniqueName: 
\"kubernetes.io/projected/c3c39296-5d56-4f48-9a56-574e5d590482-kube-api-access-76vjb\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.472144 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.472200 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.476089 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.491682 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.492105 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.492550 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.493698 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.520998 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76vjb\" (UniqueName: \"kubernetes.io/projected/c3c39296-5d56-4f48-9a56-574e5d590482-kube-api-access-76vjb\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.542759 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.565458 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574125 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574183 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574227 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574265 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574290 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwv92\" (UniqueName: \"kubernetes.io/projected/eb4102ed-3089-4b55-a29a-99c09d7243d5-kube-api-access-dwv92\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" 
Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574329 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574349 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574372 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.574797 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.575150 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.583817 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.584258 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.585204 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.594587 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.598497 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.598601 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwv92\" (UniqueName: 
\"kubernetes.io/projected/eb4102ed-3089-4b55-a29a-99c09d7243d5-kube-api-access-dwv92\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.608101 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.677901 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-nb\") pod \"b762f743-61e9-4b20-8812-85ab6edb8e04\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.677959 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-svc\") pod \"b762f743-61e9-4b20-8812-85ab6edb8e04\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.678053 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-config\") pod \"b762f743-61e9-4b20-8812-85ab6edb8e04\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.678083 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-sb\") pod \"b762f743-61e9-4b20-8812-85ab6edb8e04\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " Mar 07 07:12:37 crc 
kubenswrapper[4941]: I0307 07:12:37.678114 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-swift-storage-0\") pod \"b762f743-61e9-4b20-8812-85ab6edb8e04\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.678208 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s594r\" (UniqueName: \"kubernetes.io/projected/b762f743-61e9-4b20-8812-85ab6edb8e04-kube-api-access-s594r\") pod \"b762f743-61e9-4b20-8812-85ab6edb8e04\" (UID: \"b762f743-61e9-4b20-8812-85ab6edb8e04\") " Mar 07 07:12:37 crc kubenswrapper[4941]: W0307 07:12:37.684955 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e0538f9_8d7c_40cf_bc98_a165a41d1bf6.slice/crio-5870db5b8c8b29fde0cd60a93d792e082f4d6f245305d77e8ed5a4466819e3e2 WatchSource:0}: Error finding container 5870db5b8c8b29fde0cd60a93d792e082f4d6f245305d77e8ed5a4466819e3e2: Status 404 returned error can't find the container with id 5870db5b8c8b29fde0cd60a93d792e082f4d6f245305d77e8ed5a4466819e3e2 Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.692930 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-k7f6s"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.694732 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b762f743-61e9-4b20-8812-85ab6edb8e04-kube-api-access-s594r" (OuterVolumeSpecName: "kube-api-access-s594r") pod "b762f743-61e9-4b20-8812-85ab6edb8e04" (UID: "b762f743-61e9-4b20-8812-85ab6edb8e04"). InnerVolumeSpecName "kube-api-access-s594r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.743844 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-config" (OuterVolumeSpecName: "config") pod "b762f743-61e9-4b20-8812-85ab6edb8e04" (UID: "b762f743-61e9-4b20-8812-85ab6edb8e04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.769468 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b762f743-61e9-4b20-8812-85ab6edb8e04" (UID: "b762f743-61e9-4b20-8812-85ab6edb8e04"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.770089 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b762f743-61e9-4b20-8812-85ab6edb8e04" (UID: "b762f743-61e9-4b20-8812-85ab6edb8e04"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.770875 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b762f743-61e9-4b20-8812-85ab6edb8e04" (UID: "b762f743-61e9-4b20-8812-85ab6edb8e04"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.781473 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.781516 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.781529 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.781539 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.781551 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s594r\" (UniqueName: \"kubernetes.io/projected/b762f743-61e9-4b20-8812-85ab6edb8e04-kube-api-access-s594r\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.801387 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b762f743-61e9-4b20-8812-85ab6edb8e04" (UID: "b762f743-61e9-4b20-8812-85ab6edb8e04"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.807948 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.834943 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.855022 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k7f6s" event={"ID":"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6","Type":"ContainerStarted","Data":"5870db5b8c8b29fde0cd60a93d792e082f4d6f245305d77e8ed5a4466819e3e2"} Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.860070 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jqlt7" event={"ID":"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad","Type":"ContainerStarted","Data":"a939d5a60382d4d0ae65153ddf26bff83616488dd1b6ee21ba8a2eae2b3cab17"} Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.861866 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" event={"ID":"bb809502-d1a2-42e2-b60d-0ac873fdd73b","Type":"ContainerStarted","Data":"d0a150542df0f9fa467c5484decc64258caa293ea201ca4867de27e68a85bfec"} Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.865317 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" event={"ID":"b762f743-61e9-4b20-8812-85ab6edb8e04","Type":"ContainerDied","Data":"3106989038ec2fb62798e60aeebdfc422d81d2f1d178e72dc7829ba3d38e88c6"} Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.865358 4941 scope.go:117] "RemoveContainer" containerID="8ce4203dcdd436e35ed42ed66da874262a1aba403719aff3b186ae4baec89435" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.865504 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dfff6465-xg5bw" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.883369 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b762f743-61e9-4b20-8812-85ab6edb8e04-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.904703 4941 scope.go:117] "RemoveContainer" containerID="040dc80080e7d7b4c1466a949d5283e136322dde0c80a7f5e3acdec21688d192" Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.913274 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dfff6465-xg5bw"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.924685 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dfff6465-xg5bw"] Mar 07 07:12:37 crc kubenswrapper[4941]: I0307 07:12:37.952473 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-mlbz8"] Mar 07 07:12:38 crc kubenswrapper[4941]: W0307 07:12:38.022059 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2dc5023_3f28_4e34_be5f_bc3f59188e0b.slice/crio-64370d62cf5e7f27c4059220d205f7e2f2acf1517095fcced1be470d932b3319 WatchSource:0}: Error finding container 64370d62cf5e7f27c4059220d205f7e2f2acf1517095fcced1be470d932b3319: Status 404 returned error can't find the container with id 64370d62cf5e7f27c4059220d205f7e2f2acf1517095fcced1be470d932b3319 Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.035373 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b762f743-61e9-4b20-8812-85ab6edb8e04" path="/var/lib/kubelet/pods/b762f743-61e9-4b20-8812-85ab6edb8e04/volumes" Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.036236 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-n6fhd"] Mar 07 07:12:38 crc 
kubenswrapper[4941]: I0307 07:12:38.036265 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jswzb"] Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.036276 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.060873 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-27tlg"] Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.408995 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:12:38 crc kubenswrapper[4941]: W0307 07:12:38.410769 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3c39296_5d56_4f48_9a56_574e5d590482.slice/crio-67e506e55566ee6f61131a75177417bd379e27bc4e69024276bcac3bad8e8af6 WatchSource:0}: Error finding container 67e506e55566ee6f61131a75177417bd379e27bc4e69024276bcac3bad8e8af6: Status 404 returned error can't find the container with id 67e506e55566ee6f61131a75177417bd379e27bc4e69024276bcac3bad8e8af6 Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.524695 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:38 crc kubenswrapper[4941]: W0307 07:12:38.525141 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb4102ed_3089_4b55_a29a_99c09d7243d5.slice/crio-981d1d1d089d8b7b77c9ed43eaa09ba08aa15f5c71de518ea846330764b7a531 WatchSource:0}: Error finding container 981d1d1d089d8b7b77c9ed43eaa09ba08aa15f5c71de518ea846330764b7a531: Status 404 returned error can't find the container with id 981d1d1d089d8b7b77c9ed43eaa09ba08aa15f5c71de518ea846330764b7a531 Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.698661 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.788165 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.802891 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.897717 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3c39296-5d56-4f48-9a56-574e5d590482","Type":"ContainerStarted","Data":"67e506e55566ee6f61131a75177417bd379e27bc4e69024276bcac3bad8e8af6"} Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.901083 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb4102ed-3089-4b55-a29a-99c09d7243d5","Type":"ContainerStarted","Data":"981d1d1d089d8b7b77c9ed43eaa09ba08aa15f5c71de518ea846330764b7a531"} Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.910838 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jswzb" event={"ID":"a2dc5023-3f28-4e34-be5f-bc3f59188e0b","Type":"ContainerStarted","Data":"64370d62cf5e7f27c4059220d205f7e2f2acf1517095fcced1be470d932b3319"} Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.911949 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84812b46-cde1-4da2-9a4d-e0e6013c56fe","Type":"ContainerStarted","Data":"557d5d876fe202f263820e189debca5c7a861c43a860ce78bf0fcbfc77f648d1"} Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.912901 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" event={"ID":"a038eb59-eed0-442b-9076-5e5091511b2b","Type":"ContainerStarted","Data":"372006e51ba12b34f05c867978229560a3d9a3ba7b3ccebad756240704ba7e7e"} Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.913973 4941 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-27tlg" event={"ID":"6bdf20f4-25fe-480f-9d5a-f593b6d9a763","Type":"ContainerStarted","Data":"20a566a61906ae264318e4af1081efebac177558582abbc426cb7c55a855c474"} Mar 07 07:12:38 crc kubenswrapper[4941]: I0307 07:12:38.917523 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n6fhd" event={"ID":"3d89d1d4-04b0-4778-98d7-1cc12db0588b","Type":"ContainerStarted","Data":"f6f10e75b2145c325cdfbb2f7fb58a3081e1cded05afa3510dc280f3d740bd7c"} Mar 07 07:12:39 crc kubenswrapper[4941]: I0307 07:12:39.951226 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3c39296-5d56-4f48-9a56-574e5d590482","Type":"ContainerStarted","Data":"8ce357e1cd334772c2be6a978f309dcad840a2b350acdda7e1dc4b96a6a75d59"} Mar 07 07:12:39 crc kubenswrapper[4941]: I0307 07:12:39.964734 4941 generic.go:334] "Generic (PLEG): container finished" podID="bb809502-d1a2-42e2-b60d-0ac873fdd73b" containerID="b784acf797170216dd6f228a6a432f5a83ab04f39beded13f6350587ab05a77e" exitCode=0 Mar 07 07:12:39 crc kubenswrapper[4941]: I0307 07:12:39.967247 4941 generic.go:334] "Generic (PLEG): container finished" podID="a038eb59-eed0-442b-9076-5e5091511b2b" containerID="cf86e02af468df69b400e7d7d3af3d7126462f978890d7694d091fdee1137954" exitCode=0 Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.015758 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jqlt7" podStartSLOduration=4.015737606 podStartE2EDuration="4.015737606s" podCreationTimestamp="2026-03-07 07:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:39.979319042 +0000 UTC m=+1256.931684527" watchObservedRunningTime="2026-03-07 07:12:40.015737606 +0000 UTC m=+1256.968103071" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.034892 4941 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb4102ed-3089-4b55-a29a-99c09d7243d5","Type":"ContainerStarted","Data":"c7baf557491650de48d44b57696ad34e64cbab213133d6bfca54de6fa28cd3b3"} Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.034959 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jqlt7" event={"ID":"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad","Type":"ContainerStarted","Data":"c24fae2d64b62c86483898cb09d66f911efa88a19a00118d0ad7bfa3f022fa48"} Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.034977 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jswzb" event={"ID":"a2dc5023-3f28-4e34-be5f-bc3f59188e0b","Type":"ContainerStarted","Data":"a81ee4ba569fd2814e50a034f3ae5d58e32374cb4584d0ccc23f214dc6519957"} Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.034987 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" event={"ID":"bb809502-d1a2-42e2-b60d-0ac873fdd73b","Type":"ContainerDied","Data":"b784acf797170216dd6f228a6a432f5a83ab04f39beded13f6350587ab05a77e"} Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.034999 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" event={"ID":"a038eb59-eed0-442b-9076-5e5091511b2b","Type":"ContainerDied","Data":"cf86e02af468df69b400e7d7d3af3d7126462f978890d7694d091fdee1137954"} Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.047230 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jswzb" podStartSLOduration=4.047214055 podStartE2EDuration="4.047214055s" podCreationTimestamp="2026-03-07 07:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:40.03679283 +0000 UTC m=+1256.989158295" 
watchObservedRunningTime="2026-03-07 07:12:40.047214055 +0000 UTC m=+1256.999579520" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.401638 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.545232 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-sb\") pod \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.545547 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-config\") pod \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.545623 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzsct\" (UniqueName: \"kubernetes.io/projected/bb809502-d1a2-42e2-b60d-0ac873fdd73b-kube-api-access-kzsct\") pod \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.545662 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-nb\") pod \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.545692 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-swift-storage-0\") pod 
\"bb809502-d1a2-42e2-b60d-0ac873fdd73b\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.545763 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-svc\") pod \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\" (UID: \"bb809502-d1a2-42e2-b60d-0ac873fdd73b\") " Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.553766 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb809502-d1a2-42e2-b60d-0ac873fdd73b-kube-api-access-kzsct" (OuterVolumeSpecName: "kube-api-access-kzsct") pod "bb809502-d1a2-42e2-b60d-0ac873fdd73b" (UID: "bb809502-d1a2-42e2-b60d-0ac873fdd73b"). InnerVolumeSpecName "kube-api-access-kzsct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.575344 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb809502-d1a2-42e2-b60d-0ac873fdd73b" (UID: "bb809502-d1a2-42e2-b60d-0ac873fdd73b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.581817 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb809502-d1a2-42e2-b60d-0ac873fdd73b" (UID: "bb809502-d1a2-42e2-b60d-0ac873fdd73b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.583525 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb809502-d1a2-42e2-b60d-0ac873fdd73b" (UID: "bb809502-d1a2-42e2-b60d-0ac873fdd73b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.591918 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-config" (OuterVolumeSpecName: "config") pod "bb809502-d1a2-42e2-b60d-0ac873fdd73b" (UID: "bb809502-d1a2-42e2-b60d-0ac873fdd73b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.598673 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb809502-d1a2-42e2-b60d-0ac873fdd73b" (UID: "bb809502-d1a2-42e2-b60d-0ac873fdd73b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.649022 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.649062 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.649076 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.649091 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzsct\" (UniqueName: \"kubernetes.io/projected/bb809502-d1a2-42e2-b60d-0ac873fdd73b-kube-api-access-kzsct\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.649102 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.649113 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb809502-d1a2-42e2-b60d-0ac873fdd73b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.995838 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" event={"ID":"bb809502-d1a2-42e2-b60d-0ac873fdd73b","Type":"ContainerDied","Data":"d0a150542df0f9fa467c5484decc64258caa293ea201ca4867de27e68a85bfec"} Mar 07 07:12:40 crc 
kubenswrapper[4941]: I0307 07:12:40.996123 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67c555b79c-cqhpf" Mar 07 07:12:40 crc kubenswrapper[4941]: I0307 07:12:40.996143 4941 scope.go:117] "RemoveContainer" containerID="b784acf797170216dd6f228a6a432f5a83ab04f39beded13f6350587ab05a77e" Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:40.999887 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" event={"ID":"a038eb59-eed0-442b-9076-5e5091511b2b","Type":"ContainerStarted","Data":"a524cbc4e3f92c137635a3dcc300a6f8e28a2c4ef1de87f52d0017e6c248ac3d"} Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.000124 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.002513 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3c39296-5d56-4f48-9a56-574e5d590482","Type":"ContainerStarted","Data":"b092a945239ad05f642aa0fc9656ba71ccc52b5799c25d48a865bcc4ac6dcd1a"} Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.002621 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" containerName="glance-log" containerID="cri-o://8ce357e1cd334772c2be6a978f309dcad840a2b350acdda7e1dc4b96a6a75d59" gracePeriod=30 Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.002850 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" containerName="glance-httpd" containerID="cri-o://b092a945239ad05f642aa0fc9656ba71ccc52b5799c25d48a865bcc4ac6dcd1a" gracePeriod=30 Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.010579 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"eb4102ed-3089-4b55-a29a-99c09d7243d5","Type":"ContainerStarted","Data":"29fdc2d7930458198100975fb20fa151345a97444135f7efcbb5edd48b68847f"} Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.010976 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerName="glance-log" containerID="cri-o://c7baf557491650de48d44b57696ad34e64cbab213133d6bfca54de6fa28cd3b3" gracePeriod=30 Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.011136 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerName="glance-httpd" containerID="cri-o://29fdc2d7930458198100975fb20fa151345a97444135f7efcbb5edd48b68847f" gracePeriod=30 Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.029138 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" podStartSLOduration=5.029080085 podStartE2EDuration="5.029080085s" podCreationTimestamp="2026-03-07 07:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:41.026100629 +0000 UTC m=+1257.978466104" watchObservedRunningTime="2026-03-07 07:12:41.029080085 +0000 UTC m=+1257.981445550" Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.061927 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.061899308 podStartE2EDuration="5.061899308s" podCreationTimestamp="2026-03-07 07:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:41.053484264 +0000 UTC m=+1258.005849729" 
watchObservedRunningTime="2026-03-07 07:12:41.061899308 +0000 UTC m=+1258.014264783" Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.088107 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.088087963 podStartE2EDuration="5.088087963s" podCreationTimestamp="2026-03-07 07:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:41.076580641 +0000 UTC m=+1258.028946106" watchObservedRunningTime="2026-03-07 07:12:41.088087963 +0000 UTC m=+1258.040453418" Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.131307 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67c555b79c-cqhpf"] Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.139522 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67c555b79c-cqhpf"] Mar 07 07:12:41 crc kubenswrapper[4941]: E0307 07:12:41.840148 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb4102ed_3089_4b55_a29a_99c09d7243d5.slice/crio-conmon-29fdc2d7930458198100975fb20fa151345a97444135f7efcbb5edd48b68847f.scope\": RecentStats: unable to find data in memory cache]" Mar 07 07:12:41 crc kubenswrapper[4941]: I0307 07:12:41.976293 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb809502-d1a2-42e2-b60d-0ac873fdd73b" path="/var/lib/kubelet/pods/bb809502-d1a2-42e2-b60d-0ac873fdd73b/volumes" Mar 07 07:12:42 crc kubenswrapper[4941]: I0307 07:12:42.028967 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3c39296-5d56-4f48-9a56-574e5d590482" containerID="b092a945239ad05f642aa0fc9656ba71ccc52b5799c25d48a865bcc4ac6dcd1a" exitCode=0 Mar 07 07:12:42 crc kubenswrapper[4941]: I0307 07:12:42.029036 4941 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3c39296-5d56-4f48-9a56-574e5d590482","Type":"ContainerDied","Data":"b092a945239ad05f642aa0fc9656ba71ccc52b5799c25d48a865bcc4ac6dcd1a"} Mar 07 07:12:42 crc kubenswrapper[4941]: I0307 07:12:42.029081 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3c39296-5d56-4f48-9a56-574e5d590482","Type":"ContainerDied","Data":"8ce357e1cd334772c2be6a978f309dcad840a2b350acdda7e1dc4b96a6a75d59"} Mar 07 07:12:42 crc kubenswrapper[4941]: I0307 07:12:42.029052 4941 generic.go:334] "Generic (PLEG): container finished" podID="c3c39296-5d56-4f48-9a56-574e5d590482" containerID="8ce357e1cd334772c2be6a978f309dcad840a2b350acdda7e1dc4b96a6a75d59" exitCode=143 Mar 07 07:12:42 crc kubenswrapper[4941]: I0307 07:12:42.035686 4941 generic.go:334] "Generic (PLEG): container finished" podID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerID="29fdc2d7930458198100975fb20fa151345a97444135f7efcbb5edd48b68847f" exitCode=0 Mar 07 07:12:42 crc kubenswrapper[4941]: I0307 07:12:42.035718 4941 generic.go:334] "Generic (PLEG): container finished" podID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerID="c7baf557491650de48d44b57696ad34e64cbab213133d6bfca54de6fa28cd3b3" exitCode=143 Mar 07 07:12:42 crc kubenswrapper[4941]: I0307 07:12:42.036605 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb4102ed-3089-4b55-a29a-99c09d7243d5","Type":"ContainerDied","Data":"29fdc2d7930458198100975fb20fa151345a97444135f7efcbb5edd48b68847f"} Mar 07 07:12:42 crc kubenswrapper[4941]: I0307 07:12:42.036640 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb4102ed-3089-4b55-a29a-99c09d7243d5","Type":"ContainerDied","Data":"c7baf557491650de48d44b57696ad34e64cbab213133d6bfca54de6fa28cd3b3"} Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 
07:12:44.440134 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.534600 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-scripts\") pod \"eb4102ed-3089-4b55-a29a-99c09d7243d5\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.534776 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-httpd-run\") pod \"eb4102ed-3089-4b55-a29a-99c09d7243d5\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.534815 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-logs\") pod \"eb4102ed-3089-4b55-a29a-99c09d7243d5\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.534911 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-internal-tls-certs\") pod \"eb4102ed-3089-4b55-a29a-99c09d7243d5\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.534955 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"eb4102ed-3089-4b55-a29a-99c09d7243d5\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.534997 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-combined-ca-bundle\") pod \"eb4102ed-3089-4b55-a29a-99c09d7243d5\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.535027 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwv92\" (UniqueName: \"kubernetes.io/projected/eb4102ed-3089-4b55-a29a-99c09d7243d5-kube-api-access-dwv92\") pod \"eb4102ed-3089-4b55-a29a-99c09d7243d5\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.535097 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-config-data\") pod \"eb4102ed-3089-4b55-a29a-99c09d7243d5\" (UID: \"eb4102ed-3089-4b55-a29a-99c09d7243d5\") " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.535208 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-logs" (OuterVolumeSpecName: "logs") pod "eb4102ed-3089-4b55-a29a-99c09d7243d5" (UID: "eb4102ed-3089-4b55-a29a-99c09d7243d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.535299 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb4102ed-3089-4b55-a29a-99c09d7243d5" (UID: "eb4102ed-3089-4b55-a29a-99c09d7243d5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.539038 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.539065 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb4102ed-3089-4b55-a29a-99c09d7243d5-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.540219 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4102ed-3089-4b55-a29a-99c09d7243d5-kube-api-access-dwv92" (OuterVolumeSpecName: "kube-api-access-dwv92") pod "eb4102ed-3089-4b55-a29a-99c09d7243d5" (UID: "eb4102ed-3089-4b55-a29a-99c09d7243d5"). InnerVolumeSpecName "kube-api-access-dwv92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.540789 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "eb4102ed-3089-4b55-a29a-99c09d7243d5" (UID: "eb4102ed-3089-4b55-a29a-99c09d7243d5"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.549796 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-scripts" (OuterVolumeSpecName: "scripts") pod "eb4102ed-3089-4b55-a29a-99c09d7243d5" (UID: "eb4102ed-3089-4b55-a29a-99c09d7243d5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.570275 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb4102ed-3089-4b55-a29a-99c09d7243d5" (UID: "eb4102ed-3089-4b55-a29a-99c09d7243d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.593573 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-config-data" (OuterVolumeSpecName: "config-data") pod "eb4102ed-3089-4b55-a29a-99c09d7243d5" (UID: "eb4102ed-3089-4b55-a29a-99c09d7243d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.602015 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb4102ed-3089-4b55-a29a-99c09d7243d5" (UID: "eb4102ed-3089-4b55-a29a-99c09d7243d5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.641180 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.641214 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.641222 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.641254 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.641264 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4102ed-3089-4b55-a29a-99c09d7243d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.641274 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwv92\" (UniqueName: \"kubernetes.io/projected/eb4102ed-3089-4b55-a29a-99c09d7243d5-kube-api-access-dwv92\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.665190 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 07 07:12:44 crc kubenswrapper[4941]: I0307 07:12:44.742720 4941 reconciler_common.go:293] "Volume detached for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.070856 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb4102ed-3089-4b55-a29a-99c09d7243d5","Type":"ContainerDied","Data":"981d1d1d089d8b7b77c9ed43eaa09ba08aa15f5c71de518ea846330764b7a531"} Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.070908 4941 scope.go:117] "RemoveContainer" containerID="29fdc2d7930458198100975fb20fa151345a97444135f7efcbb5edd48b68847f" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.071198 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.094154 4941 scope.go:117] "RemoveContainer" containerID="c7baf557491650de48d44b57696ad34e64cbab213133d6bfca54de6fa28cd3b3" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.121582 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.143097 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.160357 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:45 crc kubenswrapper[4941]: E0307 07:12:45.160786 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b762f743-61e9-4b20-8812-85ab6edb8e04" containerName="init" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.160810 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b762f743-61e9-4b20-8812-85ab6edb8e04" containerName="init" Mar 07 07:12:45 crc kubenswrapper[4941]: E0307 07:12:45.160828 4941 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb809502-d1a2-42e2-b60d-0ac873fdd73b" containerName="init" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.160836 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb809502-d1a2-42e2-b60d-0ac873fdd73b" containerName="init" Mar 07 07:12:45 crc kubenswrapper[4941]: E0307 07:12:45.160848 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerName="glance-log" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.160855 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerName="glance-log" Mar 07 07:12:45 crc kubenswrapper[4941]: E0307 07:12:45.160876 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b762f743-61e9-4b20-8812-85ab6edb8e04" containerName="dnsmasq-dns" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.160883 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b762f743-61e9-4b20-8812-85ab6edb8e04" containerName="dnsmasq-dns" Mar 07 07:12:45 crc kubenswrapper[4941]: E0307 07:12:45.160901 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerName="glance-httpd" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.160907 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerName="glance-httpd" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.161090 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerName="glance-log" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.161104 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb809502-d1a2-42e2-b60d-0ac873fdd73b" containerName="init" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.161120 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b762f743-61e9-4b20-8812-85ab6edb8e04" 
containerName="dnsmasq-dns" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.161125 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" containerName="glance-httpd" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.161933 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.162095 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.165442 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.165497 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.252679 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.252753 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6zgt\" (UniqueName: \"kubernetes.io/projected/e00c9299-d657-4baa-8381-feb1a099f6f3-kube-api-access-r6zgt\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.252875 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.252944 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.253042 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.253079 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.253122 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.253192 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.354702 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.354775 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6zgt\" (UniqueName: \"kubernetes.io/projected/e00c9299-d657-4baa-8381-feb1a099f6f3-kube-api-access-r6zgt\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.354799 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.354820 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.354856 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.354876 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.355851 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.355875 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.355882 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.355889 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.355210 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.359169 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.360533 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.361969 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.372518 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6zgt\" (UniqueName: \"kubernetes.io/projected/e00c9299-d657-4baa-8381-feb1a099f6f3-kube-api-access-r6zgt\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " 
pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.372823 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.396758 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.487963 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.679415 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.765557 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-public-tls-certs\") pod \"c3c39296-5d56-4f48-9a56-574e5d590482\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.765629 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76vjb\" (UniqueName: \"kubernetes.io/projected/c3c39296-5d56-4f48-9a56-574e5d590482-kube-api-access-76vjb\") pod \"c3c39296-5d56-4f48-9a56-574e5d590482\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.765686 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"c3c39296-5d56-4f48-9a56-574e5d590482\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.765724 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-scripts\") pod \"c3c39296-5d56-4f48-9a56-574e5d590482\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.765770 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-logs\") pod \"c3c39296-5d56-4f48-9a56-574e5d590482\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.765823 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-config-data\") pod \"c3c39296-5d56-4f48-9a56-574e5d590482\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.765901 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-combined-ca-bundle\") pod \"c3c39296-5d56-4f48-9a56-574e5d590482\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.765981 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-httpd-run\") pod \"c3c39296-5d56-4f48-9a56-574e5d590482\" (UID: \"c3c39296-5d56-4f48-9a56-574e5d590482\") " Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.767973 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-logs" (OuterVolumeSpecName: "logs") pod "c3c39296-5d56-4f48-9a56-574e5d590482" (UID: "c3c39296-5d56-4f48-9a56-574e5d590482"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.771268 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c39296-5d56-4f48-9a56-574e5d590482-kube-api-access-76vjb" (OuterVolumeSpecName: "kube-api-access-76vjb") pod "c3c39296-5d56-4f48-9a56-574e5d590482" (UID: "c3c39296-5d56-4f48-9a56-574e5d590482"). InnerVolumeSpecName "kube-api-access-76vjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.771420 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c3c39296-5d56-4f48-9a56-574e5d590482" (UID: "c3c39296-5d56-4f48-9a56-574e5d590482"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.774126 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-scripts" (OuterVolumeSpecName: "scripts") pod "c3c39296-5d56-4f48-9a56-574e5d590482" (UID: "c3c39296-5d56-4f48-9a56-574e5d590482"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.775532 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "c3c39296-5d56-4f48-9a56-574e5d590482" (UID: "c3c39296-5d56-4f48-9a56-574e5d590482"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.794671 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3c39296-5d56-4f48-9a56-574e5d590482" (UID: "c3c39296-5d56-4f48-9a56-574e5d590482"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.827578 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c3c39296-5d56-4f48-9a56-574e5d590482" (UID: "c3c39296-5d56-4f48-9a56-574e5d590482"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.835052 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-config-data" (OuterVolumeSpecName: "config-data") pod "c3c39296-5d56-4f48-9a56-574e5d590482" (UID: "c3c39296-5d56-4f48-9a56-574e5d590482"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.868342 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.868380 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.868398 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76vjb\" (UniqueName: \"kubernetes.io/projected/c3c39296-5d56-4f48-9a56-574e5d590482-kube-api-access-76vjb\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.868458 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 07 07:12:45 
crc kubenswrapper[4941]: I0307 07:12:45.868471 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.868486 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c39296-5d56-4f48-9a56-574e5d590482-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.868500 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.868511 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c39296-5d56-4f48-9a56-574e5d590482-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.890911 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.969707 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:45 crc kubenswrapper[4941]: I0307 07:12:45.972457 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4102ed-3089-4b55-a29a-99c09d7243d5" path="/var/lib/kubelet/pods/eb4102ed-3089-4b55-a29a-99c09d7243d5/volumes" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.053262 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:12:46 crc kubenswrapper[4941]: W0307 07:12:46.057330 4941 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode00c9299_d657_4baa_8381_feb1a099f6f3.slice/crio-60d5da6a07923d1f3a807878bf7c9e2417646acdb587caca76b524914114106e WatchSource:0}: Error finding container 60d5da6a07923d1f3a807878bf7c9e2417646acdb587caca76b524914114106e: Status 404 returned error can't find the container with id 60d5da6a07923d1f3a807878bf7c9e2417646acdb587caca76b524914114106e Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.079081 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e00c9299-d657-4baa-8381-feb1a099f6f3","Type":"ContainerStarted","Data":"60d5da6a07923d1f3a807878bf7c9e2417646acdb587caca76b524914114106e"} Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.081617 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.081605 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3c39296-5d56-4f48-9a56-574e5d590482","Type":"ContainerDied","Data":"67e506e55566ee6f61131a75177417bd379e27bc4e69024276bcac3bad8e8af6"} Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.081894 4941 scope.go:117] "RemoveContainer" containerID="b092a945239ad05f642aa0fc9656ba71ccc52b5799c25d48a865bcc4ac6dcd1a" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.112056 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.114673 4941 scope.go:117] "RemoveContainer" containerID="8ce357e1cd334772c2be6a978f309dcad840a2b350acdda7e1dc4b96a6a75d59" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.135387 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:12:46 crc kubenswrapper[4941]: 
I0307 07:12:46.143442 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:12:46 crc kubenswrapper[4941]: E0307 07:12:46.143886 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" containerName="glance-log" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.143911 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" containerName="glance-log" Mar 07 07:12:46 crc kubenswrapper[4941]: E0307 07:12:46.143928 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" containerName="glance-httpd" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.143937 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" containerName="glance-httpd" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.144891 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" containerName="glance-log" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.144937 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" containerName="glance-httpd" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.147853 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.151651 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.151935 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.180669 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.277000 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-scripts\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.277047 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.277096 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-logs\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.277168 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.277208 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kp68\" (UniqueName: \"kubernetes.io/projected/df120d9b-6b3c-401e-9847-5799a00ccba4-kube-api-access-6kp68\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.277232 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.277296 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.277333 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-config-data\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.379636 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.379698 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-config-data\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.379729 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-scripts\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.379749 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.379786 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-logs\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.379863 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.379898 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kp68\" (UniqueName: \"kubernetes.io/projected/df120d9b-6b3c-401e-9847-5799a00ccba4-kube-api-access-6kp68\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.379920 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.380313 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.383087 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-logs\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.384042 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") 
" pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.386691 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.389661 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-config-data\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.391063 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.395348 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-scripts\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.399135 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kp68\" (UniqueName: \"kubernetes.io/projected/df120d9b-6b3c-401e-9847-5799a00ccba4-kube-api-access-6kp68\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: 
I0307 07:12:46.408841 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " pod="openstack/glance-default-external-api-0" Mar 07 07:12:46 crc kubenswrapper[4941]: I0307 07:12:46.465642 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:12:47 crc kubenswrapper[4941]: I0307 07:12:47.026239 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:12:47 crc kubenswrapper[4941]: I0307 07:12:47.096485 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-xvqwk"] Mar 07 07:12:47 crc kubenswrapper[4941]: I0307 07:12:47.096959 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerName="dnsmasq-dns" containerID="cri-o://f860858c174a197ff23e9f8272e738e7612917a538c17b2314121259cb063e02" gracePeriod=10 Mar 07 07:12:47 crc kubenswrapper[4941]: I0307 07:12:47.968339 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c39296-5d56-4f48-9a56-574e5d590482" path="/var/lib/kubelet/pods/c3c39296-5d56-4f48-9a56-574e5d590482/volumes" Mar 07 07:12:50 crc kubenswrapper[4941]: I0307 07:12:50.143886 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e00c9299-d657-4baa-8381-feb1a099f6f3","Type":"ContainerStarted","Data":"00b8ba1aafedcbae78e907fe0a3b46e0598930393958c65057796c47d5f40bd3"} Mar 07 07:12:50 crc kubenswrapper[4941]: I0307 07:12:50.195642 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused" Mar 07 07:12:51 crc kubenswrapper[4941]: I0307 07:12:51.153999 4941 generic.go:334] "Generic (PLEG): container finished" podID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerID="f860858c174a197ff23e9f8272e738e7612917a538c17b2314121259cb063e02" exitCode=0 Mar 07 07:12:51 crc kubenswrapper[4941]: I0307 07:12:51.154093 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" event={"ID":"69eaf595-4875-440a-8b7f-b9dd8787c325","Type":"ContainerDied","Data":"f860858c174a197ff23e9f8272e738e7612917a538c17b2314121259cb063e02"} Mar 07 07:12:53 crc kubenswrapper[4941]: I0307 07:12:53.170111 4941 generic.go:334] "Generic (PLEG): container finished" podID="9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" containerID="c24fae2d64b62c86483898cb09d66f911efa88a19a00118d0ad7bfa3f022fa48" exitCode=0 Mar 07 07:12:53 crc kubenswrapper[4941]: I0307 07:12:53.170213 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jqlt7" event={"ID":"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad","Type":"ContainerDied","Data":"c24fae2d64b62c86483898cb09d66f911efa88a19a00118d0ad7bfa3f022fa48"} Mar 07 07:12:55 crc kubenswrapper[4941]: I0307 07:12:55.195999 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused" Mar 07 07:12:59 crc kubenswrapper[4941]: E0307 07:12:59.096514 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 07 07:12:59 crc kubenswrapper[4941]: E0307 07:12:59.097219 4941 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lpjls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,Se
curityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-k7f6s_openstack(0e0538f9-8d7c-40cf-bc98-a165a41d1bf6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 07:12:59 crc kubenswrapper[4941]: E0307 07:12:59.098674 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-k7f6s" podUID="0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" Mar 07 07:12:59 crc kubenswrapper[4941]: E0307 07:12:59.228534 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-k7f6s" podUID="0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.195798 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.195945 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:13:00 crc kubenswrapper[4941]: E0307 
07:13:00.515769 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3" Mar 07 07:13:00 crc kubenswrapper[4941]: E0307 07:13:00.516247 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ntjlt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessag
ePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-27tlg_openstack(6bdf20f4-25fe-480f-9d5a-f593b6d9a763): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 07:13:00 crc kubenswrapper[4941]: E0307 07:13:00.517381 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-27tlg" podUID="6bdf20f4-25fe-480f-9d5a-f593b6d9a763" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.610585 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.651833 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-combined-ca-bundle\") pod \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.652036 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-credential-keys\") pod \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.652105 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-scripts\") pod \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.652146 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-fernet-keys\") pod \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.652769 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-config-data\") pod \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.652888 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb9ks\" (UniqueName: 
\"kubernetes.io/projected/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-kube-api-access-sb9ks\") pod \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\" (UID: \"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad\") " Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.660548 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" (UID: "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.660573 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" (UID: "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.673628 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-scripts" (OuterVolumeSpecName: "scripts") pod "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" (UID: "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.685575 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-kube-api-access-sb9ks" (OuterVolumeSpecName: "kube-api-access-sb9ks") pod "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" (UID: "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad"). InnerVolumeSpecName "kube-api-access-sb9ks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.696939 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-config-data" (OuterVolumeSpecName: "config-data") pod "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" (UID: "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.706059 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" (UID: "9605bd6a-d443-4b8f-b785-ac3a7f5d1fad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.755982 4941 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.756334 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.756344 4941 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.756368 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:00 crc 
kubenswrapper[4941]: I0307 07:13:00.756377 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb9ks\" (UniqueName: \"kubernetes.io/projected/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-kube-api-access-sb9ks\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:00 crc kubenswrapper[4941]: I0307 07:13:00.756389 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.253899 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jqlt7" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.253892 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jqlt7" event={"ID":"9605bd6a-d443-4b8f-b785-ac3a7f5d1fad","Type":"ContainerDied","Data":"a939d5a60382d4d0ae65153ddf26bff83616488dd1b6ee21ba8a2eae2b3cab17"} Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.253950 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a939d5a60382d4d0ae65153ddf26bff83616488dd1b6ee21ba8a2eae2b3cab17" Mar 07 07:13:01 crc kubenswrapper[4941]: E0307 07:13:01.255996 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3\\\"\"" pod="openstack/placement-db-sync-27tlg" podUID="6bdf20f4-25fe-480f-9d5a-f593b6d9a763" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.756376 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jqlt7"] Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.763257 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-jqlt7"] Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.795387 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-h62cn"] Mar 07 07:13:01 crc kubenswrapper[4941]: E0307 07:13:01.795788 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" containerName="keystone-bootstrap" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.795820 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" containerName="keystone-bootstrap" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.795997 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" containerName="keystone-bootstrap" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.796586 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.803197 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h62cn"] Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.826993 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.827066 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.827169 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.827209 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.827892 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lf9xv" Mar 07 07:13:01 crc 
kubenswrapper[4941]: I0307 07:13:01.902358 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-config-data\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.902528 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-combined-ca-bundle\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.902570 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-credential-keys\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.902663 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-scripts\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.902702 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-fernet-keys\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 
07:13:01.902773 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz26r\" (UniqueName: \"kubernetes.io/projected/4f5d8489-0104-4980-9f26-1330336ef7f0-kube-api-access-xz26r\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:01 crc kubenswrapper[4941]: I0307 07:13:01.967199 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9605bd6a-d443-4b8f-b785-ac3a7f5d1fad" path="/var/lib/kubelet/pods/9605bd6a-d443-4b8f-b785-ac3a7f5d1fad/volumes" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.005391 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz26r\" (UniqueName: \"kubernetes.io/projected/4f5d8489-0104-4980-9f26-1330336ef7f0-kube-api-access-xz26r\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.005612 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-config-data\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.005714 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-combined-ca-bundle\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.005744 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-credential-keys\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.005828 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-scripts\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.005852 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-fernet-keys\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.011795 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-scripts\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.012216 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-combined-ca-bundle\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.012488 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-fernet-keys\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " 
pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.012544 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-config-data\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.013729 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-credential-keys\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.025682 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz26r\" (UniqueName: \"kubernetes.io/projected/4f5d8489-0104-4980-9f26-1330336ef7f0-kube-api-access-xz26r\") pod \"keystone-bootstrap-h62cn\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.174818 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:02 crc kubenswrapper[4941]: I0307 07:13:02.964744 4941 scope.go:117] "RemoveContainer" containerID="6558c83382c07dad799d650a90899b82c4b381543d3e298c3a2126b0ed70b318" Mar 07 07:13:03 crc kubenswrapper[4941]: E0307 07:13:03.346570 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81" Mar 07 07:13:03 crc kubenswrapper[4941]: E0307 07:13:03.346968 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n696h564h5d5h6dh5c7h54h78hf6h6ch54bh654h5cbh5bhf6hd5h64dh5dfh579hb5h65bh96h545h5f9hcdhbh75h5bch74hc4h5d7h58fh5f5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr
:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-756qc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(84812b46-cde1-4da2-9a4d-e0e6013c56fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.431072 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.530045 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-sb\") pod \"69eaf595-4875-440a-8b7f-b9dd8787c325\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.530129 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-nb\") pod \"69eaf595-4875-440a-8b7f-b9dd8787c325\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.530269 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-config\") pod \"69eaf595-4875-440a-8b7f-b9dd8787c325\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.530351 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h22bl\" (UniqueName: \"kubernetes.io/projected/69eaf595-4875-440a-8b7f-b9dd8787c325-kube-api-access-h22bl\") pod \"69eaf595-4875-440a-8b7f-b9dd8787c325\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.530550 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-swift-storage-0\") pod \"69eaf595-4875-440a-8b7f-b9dd8787c325\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.530598 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-svc\") pod \"69eaf595-4875-440a-8b7f-b9dd8787c325\" (UID: \"69eaf595-4875-440a-8b7f-b9dd8787c325\") " Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.551694 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69eaf595-4875-440a-8b7f-b9dd8787c325-kube-api-access-h22bl" (OuterVolumeSpecName: "kube-api-access-h22bl") pod "69eaf595-4875-440a-8b7f-b9dd8787c325" (UID: "69eaf595-4875-440a-8b7f-b9dd8787c325"). InnerVolumeSpecName "kube-api-access-h22bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.579103 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69eaf595-4875-440a-8b7f-b9dd8787c325" (UID: "69eaf595-4875-440a-8b7f-b9dd8787c325"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.582659 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69eaf595-4875-440a-8b7f-b9dd8787c325" (UID: "69eaf595-4875-440a-8b7f-b9dd8787c325"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.590144 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69eaf595-4875-440a-8b7f-b9dd8787c325" (UID: "69eaf595-4875-440a-8b7f-b9dd8787c325"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.611749 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69eaf595-4875-440a-8b7f-b9dd8787c325" (UID: "69eaf595-4875-440a-8b7f-b9dd8787c325"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.612996 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-config" (OuterVolumeSpecName: "config") pod "69eaf595-4875-440a-8b7f-b9dd8787c325" (UID: "69eaf595-4875-440a-8b7f-b9dd8787c325"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.632631 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.632683 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h22bl\" (UniqueName: \"kubernetes.io/projected/69eaf595-4875-440a-8b7f-b9dd8787c325-kube-api-access-h22bl\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.632696 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.632705 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:03 crc 
kubenswrapper[4941]: I0307 07:13:03.632714 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:03 crc kubenswrapper[4941]: I0307 07:13:03.632722 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69eaf595-4875-440a-8b7f-b9dd8787c325-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:04 crc kubenswrapper[4941]: I0307 07:13:04.301769 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" event={"ID":"69eaf595-4875-440a-8b7f-b9dd8787c325","Type":"ContainerDied","Data":"3e8d6e45416e833d7f6755e9a565b39306ce55eaf003c837f502ac88d388423f"} Mar 07 07:13:04 crc kubenswrapper[4941]: I0307 07:13:04.301819 4941 scope.go:117] "RemoveContainer" containerID="f860858c174a197ff23e9f8272e738e7612917a538c17b2314121259cb063e02" Mar 07 07:13:04 crc kubenswrapper[4941]: I0307 07:13:04.301967 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74f8fb89-xvqwk" Mar 07 07:13:04 crc kubenswrapper[4941]: I0307 07:13:04.328638 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-xvqwk"] Mar 07 07:13:04 crc kubenswrapper[4941]: I0307 07:13:04.339155 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-xvqwk"] Mar 07 07:13:04 crc kubenswrapper[4941]: I0307 07:13:04.442635 4941 scope.go:117] "RemoveContainer" containerID="2e20a5e1f6ab7a40805e0646f13c4e25de495dbd25d24f8b5ca7b57c07fb052a" Mar 07 07:13:04 crc kubenswrapper[4941]: E0307 07:13:04.689888 4941 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 07 07:13:04 crc kubenswrapper[4941]: E0307 07:13:04.690328 4941 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nn56w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-n6fhd_openstack(3d89d1d4-04b0-4778-98d7-1cc12db0588b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 07:13:04 crc kubenswrapper[4941]: I0307 07:13:04.690900 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h62cn"] Mar 07 07:13:04 crc kubenswrapper[4941]: E0307 07:13:04.692608 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-n6fhd" podUID="3d89d1d4-04b0-4778-98d7-1cc12db0588b" Mar 07 07:13:04 crc kubenswrapper[4941]: W0307 07:13:04.694916 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f5d8489_0104_4980_9f26_1330336ef7f0.slice/crio-6aed1af076d2b0b0a36a2c7206087b9a4c25d9bcb20414568a34c23b77765636 WatchSource:0}: Error finding container 6aed1af076d2b0b0a36a2c7206087b9a4c25d9bcb20414568a34c23b77765636: Status 404 returned error can't find the container with id 6aed1af076d2b0b0a36a2c7206087b9a4c25d9bcb20414568a34c23b77765636 Mar 07 07:13:04 crc kubenswrapper[4941]: I0307 07:13:04.855216 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:13:04 crc kubenswrapper[4941]: W0307 07:13:04.859561 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf120d9b_6b3c_401e_9847_5799a00ccba4.slice/crio-af029582dc89509ce621f7f9fa4c7eba8871dd48fb81a128ce473d1dd677e82a WatchSource:0}: Error finding container af029582dc89509ce621f7f9fa4c7eba8871dd48fb81a128ce473d1dd677e82a: Status 404 returned error can't find the container with id af029582dc89509ce621f7f9fa4c7eba8871dd48fb81a128ce473d1dd677e82a Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.314461 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h62cn" event={"ID":"4f5d8489-0104-4980-9f26-1330336ef7f0","Type":"ContainerStarted","Data":"ed6ec49be7f1a8845083f19ad0eb4fcb07efc1ddd0b25370c3acd52ac6a40adb"} Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.315858 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h62cn" 
event={"ID":"4f5d8489-0104-4980-9f26-1330336ef7f0","Type":"ContainerStarted","Data":"6aed1af076d2b0b0a36a2c7206087b9a4c25d9bcb20414568a34c23b77765636"} Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.318724 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df120d9b-6b3c-401e-9847-5799a00ccba4","Type":"ContainerStarted","Data":"af029582dc89509ce621f7f9fa4c7eba8871dd48fb81a128ce473d1dd677e82a"} Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.324044 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e00c9299-d657-4baa-8381-feb1a099f6f3","Type":"ContainerStarted","Data":"34f9181ce1397e6679e1fcdcec34b8497b19db5025642f91b4b8ab393ba9d654"} Mar 07 07:13:05 crc kubenswrapper[4941]: E0307 07:13:05.326872 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-n6fhd" podUID="3d89d1d4-04b0-4778-98d7-1cc12db0588b" Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.341899 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-h62cn" podStartSLOduration=4.341877824 podStartE2EDuration="4.341877824s" podCreationTimestamp="2026-03-07 07:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:05.329611613 +0000 UTC m=+1282.281977078" watchObservedRunningTime="2026-03-07 07:13:05.341877824 +0000 UTC m=+1282.294243289" Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.488988 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 07 
07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.489041 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.532020 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.546147 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.555914 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=20.555900526 podStartE2EDuration="20.555900526s" podCreationTimestamp="2026-03-07 07:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:05.380888664 +0000 UTC m=+1282.333254129" watchObservedRunningTime="2026-03-07 07:13:05.555900526 +0000 UTC m=+1282.508265991" Mar 07 07:13:05 crc kubenswrapper[4941]: I0307 07:13:05.968591 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" path="/var/lib/kubelet/pods/69eaf595-4875-440a-8b7f-b9dd8787c325/volumes" Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.334922 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df120d9b-6b3c-401e-9847-5799a00ccba4","Type":"ContainerStarted","Data":"ba7162468c4c70ab71e9a28be05ef6c3bf697ae5de5b59603a39b7e5b0531b86"} Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.335308 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df120d9b-6b3c-401e-9847-5799a00ccba4","Type":"ContainerStarted","Data":"548daa53b60a430ed9122c44f5914f8f99a73a64e60a05630135d619b42368a0"} 
Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.338251 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84812b46-cde1-4da2-9a4d-e0e6013c56fe","Type":"ContainerStarted","Data":"e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2"} Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.340530 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.340551 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.362338 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=20.362324724 podStartE2EDuration="20.362324724s" podCreationTimestamp="2026-03-07 07:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:06.359835681 +0000 UTC m=+1283.312201146" watchObservedRunningTime="2026-03-07 07:13:06.362324724 +0000 UTC m=+1283.314690189" Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.466322 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.466628 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.501048 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 07 07:13:06 crc kubenswrapper[4941]: I0307 07:13:06.529747 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 07 07:13:07 crc 
kubenswrapper[4941]: I0307 07:13:07.350300 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 07 07:13:07 crc kubenswrapper[4941]: I0307 07:13:07.351226 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 07 07:13:08 crc kubenswrapper[4941]: I0307 07:13:08.311454 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 07 07:13:08 crc kubenswrapper[4941]: I0307 07:13:08.359145 4941 generic.go:334] "Generic (PLEG): container finished" podID="4f5d8489-0104-4980-9f26-1330336ef7f0" containerID="ed6ec49be7f1a8845083f19ad0eb4fcb07efc1ddd0b25370c3acd52ac6a40adb" exitCode=0 Mar 07 07:13:08 crc kubenswrapper[4941]: I0307 07:13:08.359230 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h62cn" event={"ID":"4f5d8489-0104-4980-9f26-1330336ef7f0","Type":"ContainerDied","Data":"ed6ec49be7f1a8845083f19ad0eb4fcb07efc1ddd0b25370c3acd52ac6a40adb"} Mar 07 07:13:08 crc kubenswrapper[4941]: I0307 07:13:08.361455 4941 generic.go:334] "Generic (PLEG): container finished" podID="a2dc5023-3f28-4e34-be5f-bc3f59188e0b" containerID="a81ee4ba569fd2814e50a034f3ae5d58e32374cb4584d0ccc23f214dc6519957" exitCode=0 Mar 07 07:13:08 crc kubenswrapper[4941]: I0307 07:13:08.362334 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jswzb" event={"ID":"a2dc5023-3f28-4e34-be5f-bc3f59188e0b","Type":"ContainerDied","Data":"a81ee4ba569fd2814e50a034f3ae5d58e32374cb4584d0ccc23f214dc6519957"} Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.239975 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.324227 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jswzb" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.370463 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.388940 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jswzb" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.389302 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jswzb" event={"ID":"a2dc5023-3f28-4e34-be5f-bc3f59188e0b","Type":"ContainerDied","Data":"64370d62cf5e7f27c4059220d205f7e2f2acf1517095fcced1be470d932b3319"} Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.389336 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64370d62cf5e7f27c4059220d205f7e2f2acf1517095fcced1be470d932b3319" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.395961 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h62cn" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.395947 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h62cn" event={"ID":"4f5d8489-0104-4980-9f26-1330336ef7f0","Type":"ContainerDied","Data":"6aed1af076d2b0b0a36a2c7206087b9a4c25d9bcb20414568a34c23b77765636"} Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.396000 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aed1af076d2b0b0a36a2c7206087b9a4c25d9bcb20414568a34c23b77765636" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.468815 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz26r\" (UniqueName: \"kubernetes.io/projected/4f5d8489-0104-4980-9f26-1330336ef7f0-kube-api-access-xz26r\") pod \"4f5d8489-0104-4980-9f26-1330336ef7f0\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.468890 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-config-data\") pod \"4f5d8489-0104-4980-9f26-1330336ef7f0\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.468915 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-combined-ca-bundle\") pod \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.468956 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-config\") pod \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " 
Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.468979 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-scripts\") pod \"4f5d8489-0104-4980-9f26-1330336ef7f0\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.469052 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-combined-ca-bundle\") pod \"4f5d8489-0104-4980-9f26-1330336ef7f0\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.469081 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-credential-keys\") pod \"4f5d8489-0104-4980-9f26-1330336ef7f0\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.469100 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl6rp\" (UniqueName: \"kubernetes.io/projected/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-kube-api-access-jl6rp\") pod \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\" (UID: \"a2dc5023-3f28-4e34-be5f-bc3f59188e0b\") " Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.469130 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-fernet-keys\") pod \"4f5d8489-0104-4980-9f26-1330336ef7f0\" (UID: \"4f5d8489-0104-4980-9f26-1330336ef7f0\") " Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.481092 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-credential-keys" (OuterVolumeSpecName: 
"credential-keys") pod "4f5d8489-0104-4980-9f26-1330336ef7f0" (UID: "4f5d8489-0104-4980-9f26-1330336ef7f0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.487650 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-scripts" (OuterVolumeSpecName: "scripts") pod "4f5d8489-0104-4980-9f26-1330336ef7f0" (UID: "4f5d8489-0104-4980-9f26-1330336ef7f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.487944 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4f5d8489-0104-4980-9f26-1330336ef7f0" (UID: "4f5d8489-0104-4980-9f26-1330336ef7f0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.488678 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5d8489-0104-4980-9f26-1330336ef7f0-kube-api-access-xz26r" (OuterVolumeSpecName: "kube-api-access-xz26r") pod "4f5d8489-0104-4980-9f26-1330336ef7f0" (UID: "4f5d8489-0104-4980-9f26-1330336ef7f0"). InnerVolumeSpecName "kube-api-access-xz26r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.507575 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-kube-api-access-jl6rp" (OuterVolumeSpecName: "kube-api-access-jl6rp") pod "a2dc5023-3f28-4e34-be5f-bc3f59188e0b" (UID: "a2dc5023-3f28-4e34-be5f-bc3f59188e0b"). InnerVolumeSpecName "kube-api-access-jl6rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.537812 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5c79966c-pthdw"] Mar 07 07:13:10 crc kubenswrapper[4941]: E0307 07:13:10.538160 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerName="dnsmasq-dns" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.538173 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerName="dnsmasq-dns" Mar 07 07:13:10 crc kubenswrapper[4941]: E0307 07:13:10.538187 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2dc5023-3f28-4e34-be5f-bc3f59188e0b" containerName="neutron-db-sync" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.538193 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2dc5023-3f28-4e34-be5f-bc3f59188e0b" containerName="neutron-db-sync" Mar 07 07:13:10 crc kubenswrapper[4941]: E0307 07:13:10.538212 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5d8489-0104-4980-9f26-1330336ef7f0" containerName="keystone-bootstrap" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.538219 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5d8489-0104-4980-9f26-1330336ef7f0" containerName="keystone-bootstrap" Mar 07 07:13:10 crc kubenswrapper[4941]: E0307 07:13:10.538229 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerName="init" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.538235 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerName="init" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.538394 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2dc5023-3f28-4e34-be5f-bc3f59188e0b" containerName="neutron-db-sync" Mar 07 07:13:10 crc 
kubenswrapper[4941]: I0307 07:13:10.538432 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5d8489-0104-4980-9f26-1330336ef7f0" containerName="keystone-bootstrap" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.538439 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="69eaf595-4875-440a-8b7f-b9dd8787c325" containerName="dnsmasq-dns" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.538906 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.544833 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.545091 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.549440 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5c79966c-pthdw"] Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.551516 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-config-data" (OuterVolumeSpecName: "config-data") pod "4f5d8489-0104-4980-9f26-1330336ef7f0" (UID: "4f5d8489-0104-4980-9f26-1330336ef7f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.570685 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz26r\" (UniqueName: \"kubernetes.io/projected/4f5d8489-0104-4980-9f26-1330336ef7f0-kube-api-access-xz26r\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.570713 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.570723 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.570732 4941 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.570740 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl6rp\" (UniqueName: \"kubernetes.io/projected/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-kube-api-access-jl6rp\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.570749 4941 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.578135 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2dc5023-3f28-4e34-be5f-bc3f59188e0b" (UID: 
"a2dc5023-3f28-4e34-be5f-bc3f59188e0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.595577 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f5d8489-0104-4980-9f26-1330336ef7f0" (UID: "4f5d8489-0104-4980-9f26-1330336ef7f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.598580 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-config" (OuterVolumeSpecName: "config") pod "a2dc5023-3f28-4e34-be5f-bc3f59188e0b" (UID: "a2dc5023-3f28-4e34-be5f-bc3f59188e0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.608817 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-phlcm"] Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.610977 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.623452 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-phlcm"] Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676266 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-credential-keys\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676324 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-internal-tls-certs\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676351 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-public-tls-certs\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676367 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-config-data\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676399 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-swift-storage-0\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676464 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4zrx\" (UniqueName: \"kubernetes.io/projected/4222d03b-3493-40c9-81e4-9818cd6e6cbf-kube-api-access-w4zrx\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676492 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-combined-ca-bundle\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676513 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-svc\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676533 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-scripts\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676562 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-sb\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676582 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvvg6\" (UniqueName: \"kubernetes.io/projected/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-kube-api-access-vvvg6\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676601 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-config\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676630 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-fernet-keys\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676665 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-nb\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676712 4941 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676722 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2dc5023-3f28-4e34-be5f-bc3f59188e0b-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.676731 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5d8489-0104-4980-9f26-1330336ef7f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.707097 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-576964b458-llgdq"] Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.708592 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.711709 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m482l" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.711930 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.712122 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.712690 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.716939 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-576964b458-llgdq"] Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.777802 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-config\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.777878 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-ovndb-tls-certs\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.777900 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-config\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.777917 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-fernet-keys\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.777931 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-combined-ca-bundle\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.777995 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-nb\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778021 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-credential-keys\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778067 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-internal-tls-certs\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778085 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-public-tls-certs\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778101 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5blc\" (UniqueName: \"kubernetes.io/projected/1c05d887-e05c-4593-a5ad-76be76a9e637-kube-api-access-t5blc\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778145 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-config-data\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778173 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-swift-storage-0\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778230 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-httpd-config\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778250 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4zrx\" (UniqueName: \"kubernetes.io/projected/4222d03b-3493-40c9-81e4-9818cd6e6cbf-kube-api-access-w4zrx\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778298 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-combined-ca-bundle\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778322 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-svc\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778340 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-scripts\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778388 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-sb\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.778438 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvvg6\" (UniqueName: \"kubernetes.io/projected/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-kube-api-access-vvvg6\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.780118 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-config\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.783027 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-nb\") pod 
\"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.783104 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-svc\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.783561 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-swift-storage-0\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.785029 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-credential-keys\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.785951 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-combined-ca-bundle\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.786523 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-config-data\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 
07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.790917 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-fernet-keys\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.791937 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-sb\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.795173 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-public-tls-certs\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.797858 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-scripts\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.799605 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-internal-tls-certs\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.803446 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w4zrx\" (UniqueName: \"kubernetes.io/projected/4222d03b-3493-40c9-81e4-9818cd6e6cbf-kube-api-access-w4zrx\") pod \"dnsmasq-dns-66ff44db99-phlcm\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.804462 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvvg6\" (UniqueName: \"kubernetes.io/projected/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-kube-api-access-vvvg6\") pod \"keystone-5c79966c-pthdw\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.867172 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.879618 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-ovndb-tls-certs\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.879666 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-config\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.879697 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-combined-ca-bundle\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 
07:13:10.879789 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5blc\" (UniqueName: \"kubernetes.io/projected/1c05d887-e05c-4593-a5ad-76be76a9e637-kube-api-access-t5blc\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.879877 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-httpd-config\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.884280 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-httpd-config\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.884949 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-ovndb-tls-certs\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.885085 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-combined-ca-bundle\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.886429 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-config\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.904162 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5blc\" (UniqueName: \"kubernetes.io/projected/1c05d887-e05c-4593-a5ad-76be76a9e637-kube-api-access-t5blc\") pod \"neutron-576964b458-llgdq\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.918083 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57fd5df48d-tt655"] Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.919438 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.937397 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.949535 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57fd5df48d-tt655"] Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.981860 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-combined-ca-bundle\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.981910 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-httpd-config\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.981949 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-config\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.981975 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvsr\" (UniqueName: \"kubernetes.io/projected/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-kube-api-access-ggvsr\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:10 crc kubenswrapper[4941]: I0307 07:13:10.982049 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-ovndb-tls-certs\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.045692 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.083167 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-ovndb-tls-certs\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.083224 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-combined-ca-bundle\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.083250 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-httpd-config\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.083294 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-config\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.083337 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvsr\" (UniqueName: \"kubernetes.io/projected/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-kube-api-access-ggvsr\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.087789 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-ovndb-tls-certs\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.093496 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-combined-ca-bundle\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.103119 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-httpd-config\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.104316 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-config\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.105696 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvsr\" (UniqueName: 
\"kubernetes.io/projected/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-kube-api-access-ggvsr\") pod \"neutron-57fd5df48d-tt655\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.239291 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.273400 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5c79966c-pthdw"] Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.425100 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84812b46-cde1-4da2-9a4d-e0e6013c56fe","Type":"ContainerStarted","Data":"cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77"} Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.426396 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5c79966c-pthdw" event={"ID":"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b","Type":"ContainerStarted","Data":"a775c081a00ad28718e2adc3495a133d34974dbc35088e0179f8cb2b6dcb41b2"} Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.551568 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-phlcm"] Mar 07 07:13:11 crc kubenswrapper[4941]: I0307 07:13:11.778801 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-576964b458-llgdq"] Mar 07 07:13:11 crc kubenswrapper[4941]: W0307 07:13:11.855670 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c05d887_e05c_4593_a5ad_76be76a9e637.slice/crio-ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3 WatchSource:0}: Error finding container ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3: Status 404 returned error can't find the container with id 
ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3 Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.012484 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57fd5df48d-tt655"] Mar 07 07:13:12 crc kubenswrapper[4941]: W0307 07:13:12.016291 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3e4d16b_185a_49c5_b246_a0ed7b0efe9b.slice/crio-7903c5ea1653c603598d3a936763f53b1b537f74d13149d9ca44f761e5264d30 WatchSource:0}: Error finding container 7903c5ea1653c603598d3a936763f53b1b537f74d13149d9ca44f761e5264d30: Status 404 returned error can't find the container with id 7903c5ea1653c603598d3a936763f53b1b537f74d13149d9ca44f761e5264d30 Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.472528 4941 generic.go:334] "Generic (PLEG): container finished" podID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" containerID="6b5ee8c80bd049d6e322bcde1bfdb534eae6ba5a0aa7b5f2235d9ef66ee8788d" exitCode=0 Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.475485 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" event={"ID":"4222d03b-3493-40c9-81e4-9818cd6e6cbf","Type":"ContainerDied","Data":"6b5ee8c80bd049d6e322bcde1bfdb534eae6ba5a0aa7b5f2235d9ef66ee8788d"} Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.475555 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" event={"ID":"4222d03b-3493-40c9-81e4-9818cd6e6cbf","Type":"ContainerStarted","Data":"91745c1f74e5d09a6ff41647f0c937d22392d0322a9291b89e8627657acfd4cf"} Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.509644 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57fd5df48d-tt655" event={"ID":"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b","Type":"ContainerStarted","Data":"3629b3f9473c21dc1f60d85a6d35e9abe12bf665705c1e70e6d78451d7d23a5f"} Mar 07 07:13:12 crc 
kubenswrapper[4941]: I0307 07:13:12.509689 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57fd5df48d-tt655" event={"ID":"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b","Type":"ContainerStarted","Data":"7903c5ea1653c603598d3a936763f53b1b537f74d13149d9ca44f761e5264d30"} Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.522651 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5c79966c-pthdw" event={"ID":"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b","Type":"ContainerStarted","Data":"b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1"} Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.522890 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.528430 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576964b458-llgdq" event={"ID":"1c05d887-e05c-4593-a5ad-76be76a9e637","Type":"ContainerStarted","Data":"a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2"} Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.528467 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576964b458-llgdq" event={"ID":"1c05d887-e05c-4593-a5ad-76be76a9e637","Type":"ContainerStarted","Data":"ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3"} Mar 07 07:13:12 crc kubenswrapper[4941]: I0307 07:13:12.621223 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5c79966c-pthdw" podStartSLOduration=2.62120275 podStartE2EDuration="2.62120275s" podCreationTimestamp="2026-03-07 07:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:12.581826741 +0000 UTC m=+1289.534192206" watchObservedRunningTime="2026-03-07 07:13:12.62120275 +0000 UTC m=+1289.573568215" Mar 07 07:13:13 crc kubenswrapper[4941]: 
I0307 07:13:13.487743 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-576964b458-llgdq"] Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.528167 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c6df5b777-qhsgz"] Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.531982 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.536922 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.545284 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.547259 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c6df5b777-qhsgz"] Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.645463 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-public-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.645586 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-ovndb-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.645627 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-config\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.645660 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-internal-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.645683 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-httpd-config\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.645701 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27jhd\" (UniqueName: \"kubernetes.io/projected/d3cb3645-4e27-450f-a712-f656dfa9e8e1-kube-api-access-27jhd\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.645762 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-combined-ca-bundle\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.747091 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-public-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.747176 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-ovndb-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.747219 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-config\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.747252 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-internal-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.747275 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-httpd-config\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.747319 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27jhd\" (UniqueName: \"kubernetes.io/projected/d3cb3645-4e27-450f-a712-f656dfa9e8e1-kube-api-access-27jhd\") pod 
\"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.747394 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-combined-ca-bundle\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.751668 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-combined-ca-bundle\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.751915 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-config\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.753244 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-public-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.754132 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-internal-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 
07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.761975 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-httpd-config\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.764185 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27jhd\" (UniqueName: \"kubernetes.io/projected/d3cb3645-4e27-450f-a712-f656dfa9e8e1-kube-api-access-27jhd\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.766636 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-ovndb-tls-certs\") pod \"neutron-7c6df5b777-qhsgz\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:13 crc kubenswrapper[4941]: I0307 07:13:13.856060 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.429246 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c6df5b777-qhsgz"] Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.566770 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576964b458-llgdq" event={"ID":"1c05d887-e05c-4593-a5ad-76be76a9e637","Type":"ContainerStarted","Data":"2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2"} Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.567096 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-576964b458-llgdq" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-api" containerID="cri-o://a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2" gracePeriod=30 Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.567245 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.567332 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-576964b458-llgdq" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-httpd" containerID="cri-o://2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2" gracePeriod=30 Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.574630 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c6df5b777-qhsgz" event={"ID":"d3cb3645-4e27-450f-a712-f656dfa9e8e1","Type":"ContainerStarted","Data":"1e08abf6cfacf4ab3fe67f49f69418d92c7363bf3dc7d19bd2ae73864fbbd87f"} Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.585942 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" 
event={"ID":"4222d03b-3493-40c9-81e4-9818cd6e6cbf","Type":"ContainerStarted","Data":"2c036da20ddd296adfeb420ce25d33ff31e9ffd433a28c985963ac40bec54e58"} Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.586454 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.593673 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-576964b458-llgdq" podStartSLOduration=4.593653783 podStartE2EDuration="4.593653783s" podCreationTimestamp="2026-03-07 07:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:14.590553525 +0000 UTC m=+1291.542919000" watchObservedRunningTime="2026-03-07 07:13:14.593653783 +0000 UTC m=+1291.546019248" Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.610206 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57fd5df48d-tt655" event={"ID":"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b","Type":"ContainerStarted","Data":"cfbe514f4242e0ae0b9007e0d67c894bbd9f91c60d5515850b52c12bba198d84"} Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.611066 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 07:13:14.631834 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" podStartSLOduration=4.631812242 podStartE2EDuration="4.631812242s" podCreationTimestamp="2026-03-07 07:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:14.619181871 +0000 UTC m=+1291.571547336" watchObservedRunningTime="2026-03-07 07:13:14.631812242 +0000 UTC m=+1291.584177707" Mar 07 07:13:14 crc kubenswrapper[4941]: I0307 
07:13:14.658495 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57fd5df48d-tt655" podStartSLOduration=4.658473079 podStartE2EDuration="4.658473079s" podCreationTimestamp="2026-03-07 07:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:14.643793926 +0000 UTC m=+1291.596159391" watchObservedRunningTime="2026-03-07 07:13:14.658473079 +0000 UTC m=+1291.610838554" Mar 07 07:13:15 crc kubenswrapper[4941]: I0307 07:13:15.661359 4941 generic.go:334] "Generic (PLEG): container finished" podID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerID="2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2" exitCode=0 Mar 07 07:13:15 crc kubenswrapper[4941]: I0307 07:13:15.661959 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576964b458-llgdq" event={"ID":"1c05d887-e05c-4593-a5ad-76be76a9e637","Type":"ContainerDied","Data":"2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2"} Mar 07 07:13:15 crc kubenswrapper[4941]: I0307 07:13:15.667609 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c6df5b777-qhsgz" event={"ID":"d3cb3645-4e27-450f-a712-f656dfa9e8e1","Type":"ContainerStarted","Data":"02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e"} Mar 07 07:13:15 crc kubenswrapper[4941]: I0307 07:13:15.667660 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c6df5b777-qhsgz" event={"ID":"d3cb3645-4e27-450f-a712-f656dfa9e8e1","Type":"ContainerStarted","Data":"f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0"} Mar 07 07:13:15 crc kubenswrapper[4941]: I0307 07:13:15.667696 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:15 crc kubenswrapper[4941]: I0307 07:13:15.680469 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-k7f6s" event={"ID":"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6","Type":"ContainerStarted","Data":"1cc00922fe10127ab4fdd98f27c6d3cc813448e5d7ee0455b1fa2bf47a5b5470"} Mar 07 07:13:15 crc kubenswrapper[4941]: I0307 07:13:15.701103 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c6df5b777-qhsgz" podStartSLOduration=2.701084551 podStartE2EDuration="2.701084551s" podCreationTimestamp="2026-03-07 07:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:15.689130318 +0000 UTC m=+1292.641495793" watchObservedRunningTime="2026-03-07 07:13:15.701084551 +0000 UTC m=+1292.653450016" Mar 07 07:13:15 crc kubenswrapper[4941]: I0307 07:13:15.731160 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-k7f6s" podStartSLOduration=3.085178148 podStartE2EDuration="39.731138374s" podCreationTimestamp="2026-03-07 07:12:36 +0000 UTC" firstStartedPulling="2026-03-07 07:12:37.688724194 +0000 UTC m=+1254.641089659" lastFinishedPulling="2026-03-07 07:13:14.33468442 +0000 UTC m=+1291.287049885" observedRunningTime="2026-03-07 07:13:15.727906052 +0000 UTC m=+1292.680271517" watchObservedRunningTime="2026-03-07 07:13:15.731138374 +0000 UTC m=+1292.683503949" Mar 07 07:13:18 crc kubenswrapper[4941]: I0307 07:13:18.557810 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 07 07:13:19 crc kubenswrapper[4941]: I0307 07:13:19.029901 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 07 07:13:20 crc kubenswrapper[4941]: I0307 07:13:20.940535 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:21 crc kubenswrapper[4941]: I0307 07:13:21.027501 4941 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-mlbz8"] Mar 07 07:13:21 crc kubenswrapper[4941]: I0307 07:13:21.027798 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" podUID="a038eb59-eed0-442b-9076-5e5091511b2b" containerName="dnsmasq-dns" containerID="cri-o://a524cbc4e3f92c137635a3dcc300a6f8e28a2c4ef1de87f52d0017e6c248ac3d" gracePeriod=10 Mar 07 07:13:21 crc kubenswrapper[4941]: I0307 07:13:21.744571 4941 generic.go:334] "Generic (PLEG): container finished" podID="a038eb59-eed0-442b-9076-5e5091511b2b" containerID="a524cbc4e3f92c137635a3dcc300a6f8e28a2c4ef1de87f52d0017e6c248ac3d" exitCode=0 Mar 07 07:13:21 crc kubenswrapper[4941]: I0307 07:13:21.744746 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" event={"ID":"a038eb59-eed0-442b-9076-5e5091511b2b","Type":"ContainerDied","Data":"a524cbc4e3f92c137635a3dcc300a6f8e28a2c4ef1de87f52d0017e6c248ac3d"} Mar 07 07:13:21 crc kubenswrapper[4941]: I0307 07:13:21.746388 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-27tlg" event={"ID":"6bdf20f4-25fe-480f-9d5a-f593b6d9a763","Type":"ContainerStarted","Data":"2fe58906a6536fc9344008dd7883fbdd2d04d7c1f5a06fb56a2b123eda1a587c"} Mar 07 07:13:21 crc kubenswrapper[4941]: I0307 07:13:21.749492 4941 generic.go:334] "Generic (PLEG): container finished" podID="0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" containerID="1cc00922fe10127ab4fdd98f27c6d3cc813448e5d7ee0455b1fa2bf47a5b5470" exitCode=0 Mar 07 07:13:21 crc kubenswrapper[4941]: I0307 07:13:21.749524 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k7f6s" event={"ID":"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6","Type":"ContainerDied","Data":"1cc00922fe10127ab4fdd98f27c6d3cc813448e5d7ee0455b1fa2bf47a5b5470"} Mar 07 07:13:21 crc kubenswrapper[4941]: I0307 07:13:21.762219 4941 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/placement-db-sync-27tlg" podStartSLOduration=6.801360468 podStartE2EDuration="45.762201318s" podCreationTimestamp="2026-03-07 07:12:36 +0000 UTC" firstStartedPulling="2026-03-07 07:12:38.038951643 +0000 UTC m=+1254.991317108" lastFinishedPulling="2026-03-07 07:13:16.999792493 +0000 UTC m=+1293.952157958" observedRunningTime="2026-03-07 07:13:21.761813668 +0000 UTC m=+1298.714179143" watchObservedRunningTime="2026-03-07 07:13:21.762201318 +0000 UTC m=+1298.714566783" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.327978 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.523174 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-nb\") pod \"a038eb59-eed0-442b-9076-5e5091511b2b\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.523219 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr585\" (UniqueName: \"kubernetes.io/projected/a038eb59-eed0-442b-9076-5e5091511b2b-kube-api-access-vr585\") pod \"a038eb59-eed0-442b-9076-5e5091511b2b\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.523267 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-sb\") pod \"a038eb59-eed0-442b-9076-5e5091511b2b\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.523326 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-swift-storage-0\") pod \"a038eb59-eed0-442b-9076-5e5091511b2b\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.523360 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-svc\") pod \"a038eb59-eed0-442b-9076-5e5091511b2b\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.523464 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-config\") pod \"a038eb59-eed0-442b-9076-5e5091511b2b\" (UID: \"a038eb59-eed0-442b-9076-5e5091511b2b\") " Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.529570 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a038eb59-eed0-442b-9076-5e5091511b2b-kube-api-access-vr585" (OuterVolumeSpecName: "kube-api-access-vr585") pod "a038eb59-eed0-442b-9076-5e5091511b2b" (UID: "a038eb59-eed0-442b-9076-5e5091511b2b"). InnerVolumeSpecName "kube-api-access-vr585". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.571072 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a038eb59-eed0-442b-9076-5e5091511b2b" (UID: "a038eb59-eed0-442b-9076-5e5091511b2b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.578077 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a038eb59-eed0-442b-9076-5e5091511b2b" (UID: "a038eb59-eed0-442b-9076-5e5091511b2b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.590983 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a038eb59-eed0-442b-9076-5e5091511b2b" (UID: "a038eb59-eed0-442b-9076-5e5091511b2b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.597449 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-config" (OuterVolumeSpecName: "config") pod "a038eb59-eed0-442b-9076-5e5091511b2b" (UID: "a038eb59-eed0-442b-9076-5e5091511b2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.598569 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a038eb59-eed0-442b-9076-5e5091511b2b" (UID: "a038eb59-eed0-442b-9076-5e5091511b2b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.625763 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.625812 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr585\" (UniqueName: \"kubernetes.io/projected/a038eb59-eed0-442b-9076-5e5091511b2b-kube-api-access-vr585\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.625828 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.625840 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.625855 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.625867 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a038eb59-eed0-442b-9076-5e5091511b2b-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:22 crc kubenswrapper[4941]: E0307 07:13:22.655768 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" 
podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.807823 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" event={"ID":"a038eb59-eed0-442b-9076-5e5091511b2b","Type":"ContainerDied","Data":"372006e51ba12b34f05c867978229560a3d9a3ba7b3ccebad756240704ba7e7e"} Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.808112 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.808149 4941 scope.go:117] "RemoveContainer" containerID="a524cbc4e3f92c137635a3dcc300a6f8e28a2c4ef1de87f52d0017e6c248ac3d" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.826633 4941 generic.go:334] "Generic (PLEG): container finished" podID="6bdf20f4-25fe-480f-9d5a-f593b6d9a763" containerID="2fe58906a6536fc9344008dd7883fbdd2d04d7c1f5a06fb56a2b123eda1a587c" exitCode=0 Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.826725 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-27tlg" event={"ID":"6bdf20f4-25fe-480f-9d5a-f593b6d9a763","Type":"ContainerDied","Data":"2fe58906a6536fc9344008dd7883fbdd2d04d7c1f5a06fb56a2b123eda1a587c"} Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.848658 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n6fhd" event={"ID":"3d89d1d4-04b0-4778-98d7-1cc12db0588b","Type":"ContainerStarted","Data":"0125632d69c55a0569c4381d80b85512d72762ed0a3f3134b448fae40348ea6f"} Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.853579 4941 scope.go:117] "RemoveContainer" containerID="cf86e02af468df69b400e7d7d3af3d7126462f978890d7694d091fdee1137954" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.873295 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" 
containerName="ceilometer-notification-agent" containerID="cri-o://e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2" gracePeriod=30 Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.873537 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84812b46-cde1-4da2-9a4d-e0e6013c56fe","Type":"ContainerStarted","Data":"828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535"} Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.873579 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="proxy-httpd" containerID="cri-o://828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535" gracePeriod=30 Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.873602 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.873633 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="sg-core" containerID="cri-o://cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77" gracePeriod=30 Mar 07 07:13:22 crc kubenswrapper[4941]: I0307 07:13:22.901795 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-n6fhd" podStartSLOduration=2.601535282 podStartE2EDuration="46.901780451s" podCreationTimestamp="2026-03-07 07:12:36 +0000 UTC" firstStartedPulling="2026-03-07 07:12:38.038633965 +0000 UTC m=+1254.990999430" lastFinishedPulling="2026-03-07 07:13:22.338879124 +0000 UTC m=+1299.291244599" observedRunningTime="2026-03-07 07:13:22.897820901 +0000 UTC m=+1299.850186366" watchObservedRunningTime="2026-03-07 07:13:22.901780451 +0000 UTC m=+1299.854145916" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.063465 4941 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-mlbz8"] Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.079468 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-mlbz8"] Mar 07 07:13:23 crc kubenswrapper[4941]: E0307 07:13:23.130837 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda038eb59_eed0_442b_9076_5e5091511b2b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda038eb59_eed0_442b_9076_5e5091511b2b.slice/crio-372006e51ba12b34f05c867978229560a3d9a3ba7b3ccebad756240704ba7e7e\": RecentStats: unable to find data in memory cache]" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.315051 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.485480 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-scripts\") pod \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.485537 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpjls\" (UniqueName: \"kubernetes.io/projected/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-kube-api-access-lpjls\") pod \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.485616 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-db-sync-config-data\") pod 
\"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.485643 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-config-data\") pod \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.485663 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-etc-machine-id\") pod \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.485757 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-combined-ca-bundle\") pod \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\" (UID: \"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6\") " Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.486625 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" (UID: "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.491628 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" (UID: "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.491672 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-scripts" (OuterVolumeSpecName: "scripts") pod "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" (UID: "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.491725 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-kube-api-access-lpjls" (OuterVolumeSpecName: "kube-api-access-lpjls") pod "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" (UID: "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6"). InnerVolumeSpecName "kube-api-access-lpjls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.527373 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" (UID: "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.531934 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-config-data" (OuterVolumeSpecName: "config-data") pod "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" (UID: "0e0538f9-8d7c-40cf-bc98-a165a41d1bf6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.588048 4941 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.588089 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.588098 4941 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.588107 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.588116 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.588124 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpjls\" (UniqueName: \"kubernetes.io/projected/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6-kube-api-access-lpjls\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.884249 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-k7f6s" event={"ID":"0e0538f9-8d7c-40cf-bc98-a165a41d1bf6","Type":"ContainerDied","Data":"5870db5b8c8b29fde0cd60a93d792e082f4d6f245305d77e8ed5a4466819e3e2"} Mar 07 07:13:23 crc 
kubenswrapper[4941]: I0307 07:13:23.884294 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-k7f6s" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.884313 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5870db5b8c8b29fde0cd60a93d792e082f4d6f245305d77e8ed5a4466819e3e2" Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.887088 4941 generic.go:334] "Generic (PLEG): container finished" podID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerID="828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535" exitCode=0 Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.887132 4941 generic.go:334] "Generic (PLEG): container finished" podID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerID="cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77" exitCode=2 Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.887164 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84812b46-cde1-4da2-9a4d-e0e6013c56fe","Type":"ContainerDied","Data":"828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535"} Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.887209 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84812b46-cde1-4da2-9a4d-e0e6013c56fe","Type":"ContainerDied","Data":"cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77"} Mar 07 07:13:23 crc kubenswrapper[4941]: I0307 07:13:23.976258 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a038eb59-eed0-442b-9076-5e5091511b2b" path="/var/lib/kubelet/pods/a038eb59-eed0-442b-9076-5e5091511b2b/volumes" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.089237 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:13:24 crc kubenswrapper[4941]: E0307 07:13:24.089702 4941 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a038eb59-eed0-442b-9076-5e5091511b2b" containerName="init" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.089718 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a038eb59-eed0-442b-9076-5e5091511b2b" containerName="init" Mar 07 07:13:24 crc kubenswrapper[4941]: E0307 07:13:24.089736 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" containerName="cinder-db-sync" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.089743 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" containerName="cinder-db-sync" Mar 07 07:13:24 crc kubenswrapper[4941]: E0307 07:13:24.089772 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a038eb59-eed0-442b-9076-5e5091511b2b" containerName="dnsmasq-dns" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.089780 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a038eb59-eed0-442b-9076-5e5091511b2b" containerName="dnsmasq-dns" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.089985 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a038eb59-eed0-442b-9076-5e5091511b2b" containerName="dnsmasq-dns" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.090036 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" containerName="cinder-db-sync" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.091478 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.097092 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.108624 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.108806 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.109096 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-66txm" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.115633 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.157650 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c65679777-cw55p"] Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.159394 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.170326 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c65679777-cw55p"] Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.200395 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17554198-2cb0-40a6-9cb0-bc7d4100db11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.200456 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.200509 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv2bv\" (UniqueName: \"kubernetes.io/projected/17554198-2cb0-40a6-9cb0-bc7d4100db11-kube-api-access-mv2bv\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.200537 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.200561 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.200582 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-scripts\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312554 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312608 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv2bv\" (UniqueName: \"kubernetes.io/projected/17554198-2cb0-40a6-9cb0-bc7d4100db11-kube-api-access-mv2bv\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312652 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-config\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312671 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnmpc\" (UniqueName: 
\"kubernetes.io/projected/50f72ce7-5401-447a-9b93-6761d0db9c6a-kube-api-access-fnmpc\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312697 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312735 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312757 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-scripts\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312803 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312848 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312886 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17554198-2cb0-40a6-9cb0-bc7d4100db11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312911 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.312925 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-svc\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.313316 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17554198-2cb0-40a6-9cb0-bc7d4100db11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.320420 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.321059 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.323625 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.324751 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.336096 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv2bv\" (UniqueName: \"kubernetes.io/projected/17554198-2cb0-40a6-9cb0-bc7d4100db11-kube-api-access-mv2bv\") pod \"cinder-scheduler-0\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.354714 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.356211 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.362983 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.384625 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.393624 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-27tlg" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.414231 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.414292 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-svc\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.414332 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.414364 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-config\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: 
\"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.414382 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnmpc\" (UniqueName: \"kubernetes.io/projected/50f72ce7-5401-447a-9b93-6761d0db9c6a-kube-api-access-fnmpc\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.414450 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.415587 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-config\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.415647 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.415898 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-svc\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " 
pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.416317 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.416915 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.423212 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.432066 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnmpc\" (UniqueName: \"kubernetes.io/projected/50f72ce7-5401-447a-9b93-6761d0db9c6a-kube-api-access-fnmpc\") pod \"dnsmasq-dns-6c65679777-cw55p\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.485954 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.517866 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-logs\") pod \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.517934 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntjlt\" (UniqueName: \"kubernetes.io/projected/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-kube-api-access-ntjlt\") pod \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.517976 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-combined-ca-bundle\") pod \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518009 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-scripts\") pod \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518052 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-config-data\") pod \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\" (UID: \"6bdf20f4-25fe-480f-9d5a-f593b6d9a763\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518287 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data-custom\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518319 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspgq\" (UniqueName: \"kubernetes.io/projected/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-kube-api-access-gspgq\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518352 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-logs\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518372 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518461 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-scripts\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518491 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data\") pod \"cinder-api-0\" (UID: 
\"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518526 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.518670 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-logs" (OuterVolumeSpecName: "logs") pod "6bdf20f4-25fe-480f-9d5a-f593b6d9a763" (UID: "6bdf20f4-25fe-480f-9d5a-f593b6d9a763"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.524904 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-kube-api-access-ntjlt" (OuterVolumeSpecName: "kube-api-access-ntjlt") pod "6bdf20f4-25fe-480f-9d5a-f593b6d9a763" (UID: "6bdf20f4-25fe-480f-9d5a-f593b6d9a763"). InnerVolumeSpecName "kube-api-access-ntjlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.529253 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-scripts" (OuterVolumeSpecName: "scripts") pod "6bdf20f4-25fe-480f-9d5a-f593b6d9a763" (UID: "6bdf20f4-25fe-480f-9d5a-f593b6d9a763"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.551558 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bdf20f4-25fe-480f-9d5a-f593b6d9a763" (UID: "6bdf20f4-25fe-480f-9d5a-f593b6d9a763"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.551887 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.604670 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-config-data" (OuterVolumeSpecName: "config-data") pod "6bdf20f4-25fe-480f-9d5a-f593b6d9a763" (UID: "6bdf20f4-25fe-480f-9d5a-f593b6d9a763"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.619381 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-run-httpd\") pod \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.619453 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756qc\" (UniqueName: \"kubernetes.io/projected/84812b46-cde1-4da2-9a4d-e0e6013c56fe-kube-api-access-756qc\") pod \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.619506 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-sg-core-conf-yaml\") pod \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.619559 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-log-httpd\") pod \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.619607 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-scripts\") pod \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.619639 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-combined-ca-bundle\") pod \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.619777 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-config-data\") pod \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\" (UID: \"84812b46-cde1-4da2-9a4d-e0e6013c56fe\") " Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.619809 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "84812b46-cde1-4da2-9a4d-e0e6013c56fe" (UID: "84812b46-cde1-4da2-9a4d-e0e6013c56fe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.620037 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-scripts\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.620075 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.620112 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0" 
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.620136 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data-custom\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.620179 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "84812b46-cde1-4da2-9a4d-e0e6013c56fe" (UID: "84812b46-cde1-4da2-9a4d-e0e6013c56fe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621023 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gspgq\" (UniqueName: \"kubernetes.io/projected/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-kube-api-access-gspgq\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621079 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-logs\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621103 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621166 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-logs\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621177 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621186 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntjlt\" (UniqueName: \"kubernetes.io/projected/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-kube-api-access-ntjlt\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621196 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621205 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621213 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdf20f4-25fe-480f-9d5a-f593b6d9a763-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621221 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84812b46-cde1-4da2-9a4d-e0e6013c56fe-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.621262 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.625572 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-scripts" (OuterVolumeSpecName: "scripts") pod "84812b46-cde1-4da2-9a4d-e0e6013c56fe" (UID: "84812b46-cde1-4da2-9a4d-e0e6013c56fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.626144 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-logs\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.626379 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84812b46-cde1-4da2-9a4d-e0e6013c56fe-kube-api-access-756qc" (OuterVolumeSpecName: "kube-api-access-756qc") pod "84812b46-cde1-4da2-9a4d-e0e6013c56fe" (UID: "84812b46-cde1-4da2-9a4d-e0e6013c56fe"). InnerVolumeSpecName "kube-api-access-756qc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.633361 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.637356 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-scripts\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.641641 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.642256 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data-custom\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.644672 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspgq\" (UniqueName: \"kubernetes.io/projected/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-kube-api-access-gspgq\") pod \"cinder-api-0\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.714182 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84812b46-cde1-4da2-9a4d-e0e6013c56fe" (UID: "84812b46-cde1-4da2-9a4d-e0e6013c56fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.722849 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756qc\" (UniqueName: \"kubernetes.io/projected/84812b46-cde1-4da2-9a4d-e0e6013c56fe-kube-api-access-756qc\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.722895 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.722905 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.725067 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.734247 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-config-data" (OuterVolumeSpecName: "config-data") pod "84812b46-cde1-4da2-9a4d-e0e6013c56fe" (UID: "84812b46-cde1-4da2-9a4d-e0e6013c56fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.735671 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "84812b46-cde1-4da2-9a4d-e0e6013c56fe" (UID: "84812b46-cde1-4da2-9a4d-e0e6013c56fe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.824060 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.824311 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84812b46-cde1-4da2-9a4d-e0e6013c56fe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.873298 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-685ff95674-ldzd4"]
Mar 07 07:13:24 crc kubenswrapper[4941]: E0307 07:13:24.873869 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="ceilometer-notification-agent"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.873898 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="ceilometer-notification-agent"
Mar 07 07:13:24 crc kubenswrapper[4941]: E0307 07:13:24.873941 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="sg-core"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.873954 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="sg-core"
Mar 07 07:13:24 crc kubenswrapper[4941]: E0307 07:13:24.873972 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdf20f4-25fe-480f-9d5a-f593b6d9a763" containerName="placement-db-sync"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.873984 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdf20f4-25fe-480f-9d5a-f593b6d9a763" containerName="placement-db-sync"
Mar 07 07:13:24 crc kubenswrapper[4941]: E0307 07:13:24.873998 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="proxy-httpd"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.874003 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="proxy-httpd"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.874201 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="proxy-httpd"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.874224 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdf20f4-25fe-480f-9d5a-f593b6d9a763" containerName="placement-db-sync"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.874245 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="ceilometer-notification-agent"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.874273 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerName="sg-core"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.875765 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.880740 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.880833 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.928314 4941 generic.go:334] "Generic (PLEG): container finished" podID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" containerID="e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2" exitCode=0
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.928452 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.928553 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84812b46-cde1-4da2-9a4d-e0e6013c56fe","Type":"ContainerDied","Data":"e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2"}
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.928596 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84812b46-cde1-4da2-9a4d-e0e6013c56fe","Type":"ContainerDied","Data":"557d5d876fe202f263820e189debca5c7a861c43a860ce78bf0fcbfc77f648d1"}
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.928633 4941 scope.go:117] "RemoveContainer" containerID="828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.940091 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-27tlg" event={"ID":"6bdf20f4-25fe-480f-9d5a-f593b6d9a763","Type":"ContainerDied","Data":"20a566a61906ae264318e4af1081efebac177558582abbc426cb7c55a855c474"}
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.940136 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a566a61906ae264318e4af1081efebac177558582abbc426cb7c55a855c474"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.940296 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-27tlg"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.961035 4941 scope.go:117] "RemoveContainer" containerID="cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77"
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.962617 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-685ff95674-ldzd4"]
Mar 07 07:13:24 crc kubenswrapper[4941]: I0307 07:13:24.989220 4941 scope.go:117] "RemoveContainer" containerID="e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.005828 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.029086 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-scripts\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.029142 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8m4g\" (UniqueName: \"kubernetes.io/projected/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-kube-api-access-t8m4g\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.029231 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-config-data\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.029258 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-internal-tls-certs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.029296 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-logs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.029322 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-combined-ca-bundle\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.029389 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-public-tls-certs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.037425 4941 scope.go:117] "RemoveContainer" containerID="828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535"
Mar 07 07:13:25 crc kubenswrapper[4941]: E0307 07:13:25.037800 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535\": container with ID starting with 828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535 not found: ID does not exist" containerID="828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.037831 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535"} err="failed to get container status \"828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535\": rpc error: code = NotFound desc = could not find container \"828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535\": container with ID starting with 828e887fc0ee20e783a3104ea1a99c5d27947e5d236fd5823846e604788b2535 not found: ID does not exist"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.037852 4941 scope.go:117] "RemoveContainer" containerID="cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77"
Mar 07 07:13:25 crc kubenswrapper[4941]: E0307 07:13:25.038060 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77\": container with ID starting with cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77 not found: ID does not exist" containerID="cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.038082 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77"} err="failed to get container status \"cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77\": rpc error: code = NotFound desc = could not find container \"cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77\": container with ID starting with cf7db1800007c2fa7b28afcfff85a564b2956d6e80f417769b84258a62bb7c77 not found: ID does not exist"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.038098 4941 scope.go:117] "RemoveContainer" containerID="e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2"
Mar 07 07:13:25 crc kubenswrapper[4941]: E0307 07:13:25.038850 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2\": container with ID starting with e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2 not found: ID does not exist" containerID="e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.038875 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2"} err="failed to get container status \"e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2\": rpc error: code = NotFound desc = could not find container \"e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2\": container with ID starting with e4bcb35ecd4b1cf4d395e9040c63991738ac3a9b1b0e5cec3a66ee4d743594b2 not found: ID does not exist"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.102544 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.116714 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.127677 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.130322 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.130714 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-logs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.130763 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-combined-ca-bundle\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.130857 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-public-tls-certs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.130915 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-scripts\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.130952 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8m4g\" (UniqueName: \"kubernetes.io/projected/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-kube-api-access-t8m4g\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.131038 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-config-data\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.131068 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-internal-tls-certs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.131334 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-logs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.134280 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.134704 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.136885 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-internal-tls-certs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.138859 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-public-tls-certs\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.145315 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-combined-ca-bundle\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.145787 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c65679777-cw55p"]
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.157336 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.159380 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8m4g\" (UniqueName: \"kubernetes.io/projected/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-kube-api-access-t8m4g\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.170771 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-scripts\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.175913 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-config-data\") pod \"placement-685ff95674-ldzd4\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.222769 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-685ff95674-ldzd4"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.232502 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6jsd\" (UniqueName: \"kubernetes.io/projected/e91dd3a3-9ab6-4183-8287-93b529afcd93-kube-api-access-x6jsd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.232561 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-run-httpd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.232599 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.232660 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-scripts\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.232688 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.232716 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-config-data\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.232772 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-log-httpd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.333926 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6jsd\" (UniqueName: \"kubernetes.io/projected/e91dd3a3-9ab6-4183-8287-93b529afcd93-kube-api-access-x6jsd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.334306 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-run-httpd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.334718 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-run-httpd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.334775 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.334809 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-scripts\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.334890 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.334916 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-config-data\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.334988 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-log-httpd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.335346 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-log-httpd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.340304 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.340997 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.341208 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-scripts\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.342054 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-config-data\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.352919 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6jsd\" (UniqueName: \"kubernetes.io/projected/e91dd3a3-9ab6-4183-8287-93b529afcd93-kube-api-access-x6jsd\") pod \"ceilometer-0\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.362824 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 07 07:13:25 crc kubenswrapper[4941]: W0307 07:13:25.369309 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf20d78e1_0e15_49bc_883d_a3d3ce0a1746.slice/crio-0ba493289b2e132ca32e5252dd33377c494cb604ebeff3858c01d76e063a78c1 WatchSource:0}: Error finding container 0ba493289b2e132ca32e5252dd33377c494cb604ebeff3858c01d76e063a78c1: Status 404 returned error can't find the container with id 0ba493289b2e132ca32e5252dd33377c494cb604ebeff3858c01d76e063a78c1
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.486965 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.787256 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-685ff95674-ldzd4"]
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.952714 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f20d78e1-0e15-49bc-883d-a3d3ce0a1746","Type":"ContainerStarted","Data":"0ba493289b2e132ca32e5252dd33377c494cb604ebeff3858c01d76e063a78c1"}
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.953468 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17554198-2cb0-40a6-9cb0-bc7d4100db11","Type":"ContainerStarted","Data":"f332dbbe105ff5b62574881287cc8c3dc525bb92de9252b0726a076d7bae3dca"}
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.954543 4941 generic.go:334] "Generic (PLEG): container finished" podID="50f72ce7-5401-447a-9b93-6761d0db9c6a" containerID="283944fffcc901ad82bbce7c77945f979ac62b8e6938453b65879e928c81ec34" exitCode=0
Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.973845 4941 kubelet_volumes.go:163]
"Cleaned up orphaned pod volumes dir" podUID="84812b46-cde1-4da2-9a4d-e0e6013c56fe" path="/var/lib/kubelet/pods/84812b46-cde1-4da2-9a4d-e0e6013c56fe/volumes" Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.974475 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c65679777-cw55p" event={"ID":"50f72ce7-5401-447a-9b93-6761d0db9c6a","Type":"ContainerDied","Data":"283944fffcc901ad82bbce7c77945f979ac62b8e6938453b65879e928c81ec34"} Mar 07 07:13:25 crc kubenswrapper[4941]: I0307 07:13:25.974504 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c65679777-cw55p" event={"ID":"50f72ce7-5401-447a-9b93-6761d0db9c6a","Type":"ContainerStarted","Data":"2ce1414aa88077979788d98ddd5dd0891f8ad56c621211fab3c0f390eb3d7390"} Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.170174 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.972120 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-685ff95674-ldzd4" event={"ID":"ad2b6a75-839f-4fec-9f12-fb520b44c7ce","Type":"ContainerStarted","Data":"fb6516769d261d733fc9be130e2e6aea292a7c5ff2e94299bbabd569bbe859a7"} Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.972748 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-685ff95674-ldzd4" event={"ID":"ad2b6a75-839f-4fec-9f12-fb520b44c7ce","Type":"ContainerStarted","Data":"f8fe04edd08619bd4118ac72769de11cb1dc8824ae2dfa0799d60d9d7cab0731"} Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.972767 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-685ff95674-ldzd4" event={"ID":"ad2b6a75-839f-4fec-9f12-fb520b44c7ce","Type":"ContainerStarted","Data":"06f78592f5289b476b99770c794d25dc413e7266988baee70a28e8e01616b6b7"} Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.973906 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/placement-685ff95674-ldzd4" Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.973933 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-685ff95674-ldzd4" Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.980676 4941 generic.go:334] "Generic (PLEG): container finished" podID="3d89d1d4-04b0-4778-98d7-1cc12db0588b" containerID="0125632d69c55a0569c4381d80b85512d72762ed0a3f3134b448fae40348ea6f" exitCode=0 Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.980760 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n6fhd" event={"ID":"3d89d1d4-04b0-4778-98d7-1cc12db0588b","Type":"ContainerDied","Data":"0125632d69c55a0569c4381d80b85512d72762ed0a3f3134b448fae40348ea6f"} Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.983102 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerStarted","Data":"2d45ad1efacc5b7ee03ab5d48754510fd8ec19599a71274a20c858f917cde2f0"} Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.991310 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c65679777-cw55p" event={"ID":"50f72ce7-5401-447a-9b93-6761d0db9c6a","Type":"ContainerStarted","Data":"22ffe531e01847af7ad108a5c553f268eb618deeba4f9532ec75229571a17c49"} Mar 07 07:13:26 crc kubenswrapper[4941]: I0307 07:13:26.992646 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:27 crc kubenswrapper[4941]: I0307 07:13:27.005743 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f20d78e1-0e15-49bc-883d-a3d3ce0a1746","Type":"ContainerStarted","Data":"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97"} Mar 07 07:13:27 crc kubenswrapper[4941]: I0307 07:13:27.012394 4941 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/placement-685ff95674-ldzd4" podStartSLOduration=3.012376101 podStartE2EDuration="3.012376101s" podCreationTimestamp="2026-03-07 07:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:26.998684414 +0000 UTC m=+1303.951049879" watchObservedRunningTime="2026-03-07 07:13:27.012376101 +0000 UTC m=+1303.964741566" Mar 07 07:13:27 crc kubenswrapper[4941]: I0307 07:13:27.012774 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17554198-2cb0-40a6-9cb0-bc7d4100db11","Type":"ContainerStarted","Data":"abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895"} Mar 07 07:13:27 crc kubenswrapper[4941]: I0307 07:13:27.022104 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c65679777-cw55p" podStartSLOduration=3.022082878 podStartE2EDuration="3.022082878s" podCreationTimestamp="2026-03-07 07:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:27.019063461 +0000 UTC m=+1303.971428936" watchObservedRunningTime="2026-03-07 07:13:27.022082878 +0000 UTC m=+1303.974448353" Mar 07 07:13:27 crc kubenswrapper[4941]: I0307 07:13:27.026361 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-754b99d75-mlbz8" podUID="a038eb59-eed0-442b-9076-5e5091511b2b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Mar 07 07:13:27 crc kubenswrapper[4941]: I0307 07:13:27.461492 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.022704 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerStarted","Data":"60317e62b500b74652eac982fea934a674e730ed4d2b573d4ef18c1146ab1cbd"} Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.022944 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerStarted","Data":"65a4666ce8e68c7938a912563839662d2d2bdd66f54dbd2cdd96614dc19f52d8"} Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.024620 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f20d78e1-0e15-49bc-883d-a3d3ce0a1746","Type":"ContainerStarted","Data":"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a"} Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.025472 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.028902 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17554198-2cb0-40a6-9cb0-bc7d4100db11","Type":"ContainerStarted","Data":"55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a"} Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.053215 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.053197128 podStartE2EDuration="4.053197128s" podCreationTimestamp="2026-03-07 07:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:28.045105383 +0000 UTC m=+1304.997470848" watchObservedRunningTime="2026-03-07 07:13:28.053197128 +0000 UTC m=+1305.005562593" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.632519 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.654150 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.666464573 podStartE2EDuration="4.65411558s" podCreationTimestamp="2026-03-07 07:13:24 +0000 UTC" firstStartedPulling="2026-03-07 07:13:25.044949837 +0000 UTC m=+1301.997315302" lastFinishedPulling="2026-03-07 07:13:26.032600844 +0000 UTC m=+1302.984966309" observedRunningTime="2026-03-07 07:13:28.081380544 +0000 UTC m=+1305.033746009" watchObservedRunningTime="2026-03-07 07:13:28.65411558 +0000 UTC m=+1305.606481045" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.697131 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn56w\" (UniqueName: \"kubernetes.io/projected/3d89d1d4-04b0-4778-98d7-1cc12db0588b-kube-api-access-nn56w\") pod \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.697332 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-combined-ca-bundle\") pod \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.697456 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-db-sync-config-data\") pod \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\" (UID: \"3d89d1d4-04b0-4778-98d7-1cc12db0588b\") " Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.703411 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-db-sync-config-data" 
(OuterVolumeSpecName: "db-sync-config-data") pod "3d89d1d4-04b0-4778-98d7-1cc12db0588b" (UID: "3d89d1d4-04b0-4778-98d7-1cc12db0588b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.707600 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d89d1d4-04b0-4778-98d7-1cc12db0588b-kube-api-access-nn56w" (OuterVolumeSpecName: "kube-api-access-nn56w") pod "3d89d1d4-04b0-4778-98d7-1cc12db0588b" (UID: "3d89d1d4-04b0-4778-98d7-1cc12db0588b"). InnerVolumeSpecName "kube-api-access-nn56w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.756676 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d89d1d4-04b0-4778-98d7-1cc12db0588b" (UID: "3d89d1d4-04b0-4778-98d7-1cc12db0588b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.799666 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.799703 4941 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d89d1d4-04b0-4778-98d7-1cc12db0588b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:28 crc kubenswrapper[4941]: I0307 07:13:28.799719 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn56w\" (UniqueName: \"kubernetes.io/projected/3d89d1d4-04b0-4778-98d7-1cc12db0588b-kube-api-access-nn56w\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.042284 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n6fhd" event={"ID":"3d89d1d4-04b0-4778-98d7-1cc12db0588b","Type":"ContainerDied","Data":"f6f10e75b2145c325cdfbb2f7fb58a3081e1cded05afa3510dc280f3d740bd7c"} Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.042318 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6f10e75b2145c325cdfbb2f7fb58a3081e1cded05afa3510dc280f3d740bd7c" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.042511 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-n6fhd" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.043109 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerName="cinder-api-log" containerID="cri-o://bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97" gracePeriod=30 Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.043529 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerName="cinder-api" containerID="cri-o://3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a" gracePeriod=30 Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.272993 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-57855ff457-mshjt"] Mar 07 07:13:29 crc kubenswrapper[4941]: E0307 07:13:29.283666 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d89d1d4-04b0-4778-98d7-1cc12db0588b" containerName="barbican-db-sync" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.283698 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d89d1d4-04b0-4778-98d7-1cc12db0588b" containerName="barbican-db-sync" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.283996 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d89d1d4-04b0-4778-98d7-1cc12db0588b" containerName="barbican-db-sync" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.284855 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57855ff457-mshjt"] Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.284942 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.288682 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jd92q" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.288829 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.288932 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.345848 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-86888b7b66-mgpdx"] Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.347425 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.353651 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.365814 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-86888b7b66-mgpdx"] Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417338 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twnms\" (UniqueName: \"kubernetes.io/projected/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-kube-api-access-twnms\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417382 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-combined-ca-bundle\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417417 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-combined-ca-bundle\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417466 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data-custom\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417482 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-logs\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417502 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4l9f\" (UniqueName: \"kubernetes.io/projected/e27683db-592f-485a-93b3-93273e1644c3-kube-api-access-l4l9f\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: 
I0307 07:13:29.417537 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data-custom\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417585 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417609 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.417630 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e27683db-592f-485a-93b3-93273e1644c3-logs\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.421607 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c65679777-cw55p"] Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.426018 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 
07:13:29.466701 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-kxnb4"] Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.469497 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.515461 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-kxnb4"] Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.518830 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-swift-storage-0\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519114 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519139 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519170 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: 
\"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519191 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e27683db-592f-485a-93b3-93273e1644c3-logs\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519219 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-config\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519247 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfhjb\" (UniqueName: \"kubernetes.io/projected/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-kube-api-access-hfhjb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519274 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twnms\" (UniqueName: \"kubernetes.io/projected/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-kube-api-access-twnms\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519293 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-combined-ca-bundle\") pod 
\"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519317 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-combined-ca-bundle\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519341 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519377 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-svc\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519426 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data-custom\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519442 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-logs\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519467 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4l9f\" (UniqueName: \"kubernetes.io/projected/e27683db-592f-485a-93b3-93273e1644c3-kube-api-access-l4l9f\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.519505 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data-custom\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.524503 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-logs\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.524835 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e27683db-592f-485a-93b3-93273e1644c3-logs\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.531946 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.538883 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data-custom\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.542041 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-combined-ca-bundle\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.548398 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data-custom\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.550841 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-combined-ca-bundle\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.555867 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.559275 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twnms\" (UniqueName: \"kubernetes.io/projected/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-kube-api-access-twnms\") pod \"barbican-keystone-listener-86888b7b66-mgpdx\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.559613 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4l9f\" (UniqueName: \"kubernetes.io/projected/e27683db-592f-485a-93b3-93273e1644c3-kube-api-access-l4l9f\") pod \"barbican-worker-57855ff457-mshjt\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.570713 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-665f7c588-lcghc"] Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.572059 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.574390 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.595395 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-665f7c588-lcghc"] Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.607065 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620401 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-combined-ca-bundle\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620450 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb79q\" (UniqueName: \"kubernetes.io/projected/68568417-4fd1-4914-8efd-2acbbedc66f9-kube-api-access-cb79q\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620481 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620507 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-swift-storage-0\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620524 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68568417-4fd1-4914-8efd-2acbbedc66f9-logs\") pod \"barbican-api-665f7c588-lcghc\" 
(UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620566 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620600 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-config\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620618 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data-custom\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620636 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfhjb\" (UniqueName: \"kubernetes.io/projected/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-kube-api-access-hfhjb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620667 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: 
\"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.620691 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-svc\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.621654 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-svc\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.622276 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-config\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.624154 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.624473 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-swift-storage-0\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc 
kubenswrapper[4941]: I0307 07:13:29.624479 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.641463 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfhjb\" (UniqueName: \"kubernetes.io/projected/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-kube-api-access-hfhjb\") pod \"dnsmasq-dns-5cbf7756bf-kxnb4\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.671879 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.723226 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68568417-4fd1-4914-8efd-2acbbedc66f9-logs\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.723351 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data-custom\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.723463 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-combined-ca-bundle\") pod 
\"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.723489 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb79q\" (UniqueName: \"kubernetes.io/projected/68568417-4fd1-4914-8efd-2acbbedc66f9-kube-api-access-cb79q\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.723533 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.730059 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.730308 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68568417-4fd1-4914-8efd-2acbbedc66f9-logs\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.736084 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data-custom\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " 
pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.743116 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-combined-ca-bundle\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.761766 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb79q\" (UniqueName: \"kubernetes.io/projected/68568417-4fd1-4914-8efd-2acbbedc66f9-kube-api-access-cb79q\") pod \"barbican-api-665f7c588-lcghc\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.812903 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:29 crc kubenswrapper[4941]: I0307 07:13:29.916847 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.017907 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.069364 4941 generic.go:334] "Generic (PLEG): container finished" podID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerID="3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a" exitCode=0 Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.069415 4941 generic.go:334] "Generic (PLEG): container finished" podID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerID="bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97" exitCode=143 Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.069470 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f20d78e1-0e15-49bc-883d-a3d3ce0a1746","Type":"ContainerDied","Data":"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a"} Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.069623 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f20d78e1-0e15-49bc-883d-a3d3ce0a1746","Type":"ContainerDied","Data":"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97"} Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.069640 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f20d78e1-0e15-49bc-883d-a3d3ce0a1746","Type":"ContainerDied","Data":"0ba493289b2e132ca32e5252dd33377c494cb604ebeff3858c01d76e063a78c1"} Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.069656 4941 scope.go:117] "RemoveContainer" containerID="3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.069872 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.086998 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerStarted","Data":"5f1b9e84f63b5ec6fa685f6c77e6d290c2fd261231331c2de886282c82884767"} Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.087212 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c65679777-cw55p" podUID="50f72ce7-5401-447a-9b93-6761d0db9c6a" containerName="dnsmasq-dns" containerID="cri-o://22ffe531e01847af7ad108a5c553f268eb618deeba4f9532ec75229571a17c49" gracePeriod=10 Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.099190 4941 scope.go:117] "RemoveContainer" containerID="bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.131228 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gspgq\" (UniqueName: \"kubernetes.io/projected/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-kube-api-access-gspgq\") pod \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.131367 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-etc-machine-id\") pod \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.131510 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-logs\") pod \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 
07:13:30.131551 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-scripts\") pod \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.131727 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-combined-ca-bundle\") pod \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.131775 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data-custom\") pod \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.131813 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data\") pod \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\" (UID: \"f20d78e1-0e15-49bc-883d-a3d3ce0a1746\") " Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.131724 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f20d78e1-0e15-49bc-883d-a3d3ce0a1746" (UID: "f20d78e1-0e15-49bc-883d-a3d3ce0a1746"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.133716 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-logs" (OuterVolumeSpecName: "logs") pod "f20d78e1-0e15-49bc-883d-a3d3ce0a1746" (UID: "f20d78e1-0e15-49bc-883d-a3d3ce0a1746"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.139630 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-kube-api-access-gspgq" (OuterVolumeSpecName: "kube-api-access-gspgq") pod "f20d78e1-0e15-49bc-883d-a3d3ce0a1746" (UID: "f20d78e1-0e15-49bc-883d-a3d3ce0a1746"). InnerVolumeSpecName "kube-api-access-gspgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.139692 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f20d78e1-0e15-49bc-883d-a3d3ce0a1746" (UID: "f20d78e1-0e15-49bc-883d-a3d3ce0a1746"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.143364 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-scripts" (OuterVolumeSpecName: "scripts") pod "f20d78e1-0e15-49bc-883d-a3d3ce0a1746" (UID: "f20d78e1-0e15-49bc-883d-a3d3ce0a1746"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.169663 4941 scope.go:117] "RemoveContainer" containerID="3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a" Mar 07 07:13:30 crc kubenswrapper[4941]: E0307 07:13:30.170505 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a\": container with ID starting with 3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a not found: ID does not exist" containerID="3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.170535 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a"} err="failed to get container status \"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a\": rpc error: code = NotFound desc = could not find container \"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a\": container with ID starting with 3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a not found: ID does not exist" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.170562 4941 scope.go:117] "RemoveContainer" containerID="bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97" Mar 07 07:13:30 crc kubenswrapper[4941]: E0307 07:13:30.171570 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97\": container with ID starting with bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97 not found: ID does not exist" containerID="bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.171611 
4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97"} err="failed to get container status \"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97\": rpc error: code = NotFound desc = could not find container \"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97\": container with ID starting with bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97 not found: ID does not exist" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.171628 4941 scope.go:117] "RemoveContainer" containerID="3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.171927 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a"} err="failed to get container status \"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a\": rpc error: code = NotFound desc = could not find container \"3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a\": container with ID starting with 3dff7750f59e3b0f1d73bbd3b7eae8a886cce7ef6593619660aca1607d9e901a not found: ID does not exist" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.171944 4941 scope.go:117] "RemoveContainer" containerID="bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.171984 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57855ff457-mshjt"] Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.172216 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97"} err="failed to get container status \"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97\": rpc error: 
code = NotFound desc = could not find container \"bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97\": container with ID starting with bb5f063572d079c6b1242a9e44003968db5dc69b91500094bd6e4f439d229e97 not found: ID does not exist" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.179545 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f20d78e1-0e15-49bc-883d-a3d3ce0a1746" (UID: "f20d78e1-0e15-49bc-883d-a3d3ce0a1746"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:30 crc kubenswrapper[4941]: W0307 07:13:30.186002 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27683db_592f_485a_93b3_93273e1644c3.slice/crio-8343bff8e6b25be2f73ec6dd7a92169c700eb64d11432479ec75e91e67910eaa WatchSource:0}: Error finding container 8343bff8e6b25be2f73ec6dd7a92169c700eb64d11432479ec75e91e67910eaa: Status 404 returned error can't find the container with id 8343bff8e6b25be2f73ec6dd7a92169c700eb64d11432479ec75e91e67910eaa Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.206360 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data" (OuterVolumeSpecName: "config-data") pod "f20d78e1-0e15-49bc-883d-a3d3ce0a1746" (UID: "f20d78e1-0e15-49bc-883d-a3d3ce0a1746"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.234615 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.234952 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.234962 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.234972 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.234980 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.234989 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gspgq\" (UniqueName: \"kubernetes.io/projected/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-kube-api-access-gspgq\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.234998 4941 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f20d78e1-0e15-49bc-883d-a3d3ce0a1746-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.279337 4941 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-86888b7b66-mgpdx"] Mar 07 07:13:30 crc kubenswrapper[4941]: W0307 07:13:30.285827 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea4583b7_29d7_466d_8c3d_ad9981ebc66d.slice/crio-47388a5e4567c5d86ee5a3b0900abeeb98ae104b94c78f56ce15948eaf6de1b5 WatchSource:0}: Error finding container 47388a5e4567c5d86ee5a3b0900abeeb98ae104b94c78f56ce15948eaf6de1b5: Status 404 returned error can't find the container with id 47388a5e4567c5d86ee5a3b0900abeeb98ae104b94c78f56ce15948eaf6de1b5 Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.406453 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.432302 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.447463 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:13:30 crc kubenswrapper[4941]: E0307 07:13:30.447830 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerName="cinder-api" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.447841 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerName="cinder-api" Mar 07 07:13:30 crc kubenswrapper[4941]: E0307 07:13:30.447853 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerName="cinder-api-log" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.447861 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerName="cinder-api-log" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.448028 4941 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerName="cinder-api" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.448043 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" containerName="cinder-api-log" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.449026 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.451980 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.452232 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.452355 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.456696 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.526532 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-665f7c588-lcghc"] Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.538278 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-kxnb4"] Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540339 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540374 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwxkq\" (UniqueName: 
\"kubernetes.io/projected/753c78f9-47e6-4098-91fa-9adac0997ba4-kube-api-access-bwxkq\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540522 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540614 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/753c78f9-47e6-4098-91fa-9adac0997ba4-logs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540642 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-scripts\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540669 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data-custom\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540719 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/753c78f9-47e6-4098-91fa-9adac0997ba4-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540774 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.540842 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-public-tls-certs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.642773 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.642866 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-public-tls-certs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.642898 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.642922 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bwxkq\" (UniqueName: \"kubernetes.io/projected/753c78f9-47e6-4098-91fa-9adac0997ba4-kube-api-access-bwxkq\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.642995 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.643049 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-scripts\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.643066 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/753c78f9-47e6-4098-91fa-9adac0997ba4-logs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.643088 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data-custom\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.643124 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/753c78f9-47e6-4098-91fa-9adac0997ba4-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.644225 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/753c78f9-47e6-4098-91fa-9adac0997ba4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.644256 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/753c78f9-47e6-4098-91fa-9adac0997ba4-logs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.647554 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-public-tls-certs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.647625 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data-custom\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.647828 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.648425 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-scripts\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.649219 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:30 crc kubenswrapper[4941]: I0307 07:13:30.649804 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.010300 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwxkq\" (UniqueName: \"kubernetes.io/projected/753c78f9-47e6-4098-91fa-9adac0997ba4-kube-api-access-bwxkq\") pod \"cinder-api-0\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " pod="openstack/cinder-api-0" Mar 07 07:13:31 crc kubenswrapper[4941]: W0307 07:13:31.068151 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3f0f8f_0b63_4775_a794_adb5f51cfe66.slice/crio-521b29dcdf0aae20d2f26d535950e34532f6eb437ccb32433c66e3fac28df7ac WatchSource:0}: Error finding container 521b29dcdf0aae20d2f26d535950e34532f6eb437ccb32433c66e3fac28df7ac: Status 404 returned error can't find the container with id 521b29dcdf0aae20d2f26d535950e34532f6eb437ccb32433c66e3fac28df7ac Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.078017 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.114282 4941 generic.go:334] "Generic (PLEG): container finished" podID="50f72ce7-5401-447a-9b93-6761d0db9c6a" containerID="22ffe531e01847af7ad108a5c553f268eb618deeba4f9532ec75229571a17c49" exitCode=0 Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.114356 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c65679777-cw55p" event={"ID":"50f72ce7-5401-447a-9b93-6761d0db9c6a","Type":"ContainerDied","Data":"22ffe531e01847af7ad108a5c553f268eb618deeba4f9532ec75229571a17c49"} Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.114388 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c65679777-cw55p" event={"ID":"50f72ce7-5401-447a-9b93-6761d0db9c6a","Type":"ContainerDied","Data":"2ce1414aa88077979788d98ddd5dd0891f8ad56c621211fab3c0f390eb3d7390"} Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.114419 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce1414aa88077979788d98ddd5dd0891f8ad56c621211fab3c0f390eb3d7390" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.116815 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" event={"ID":"ce3f0f8f-0b63-4775-a794-adb5f51cfe66","Type":"ContainerStarted","Data":"521b29dcdf0aae20d2f26d535950e34532f6eb437ccb32433c66e3fac28df7ac"} Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.117829 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57855ff457-mshjt" event={"ID":"e27683db-592f-485a-93b3-93273e1644c3","Type":"ContainerStarted","Data":"8343bff8e6b25be2f73ec6dd7a92169c700eb64d11432479ec75e91e67910eaa"} Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.120541 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" 
event={"ID":"ea4583b7-29d7-466d-8c3d-ad9981ebc66d","Type":"ContainerStarted","Data":"47388a5e4567c5d86ee5a3b0900abeeb98ae104b94c78f56ce15948eaf6de1b5"} Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.125054 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665f7c588-lcghc" event={"ID":"68568417-4fd1-4914-8efd-2acbbedc66f9","Type":"ContainerStarted","Data":"214f1f5cca2bbf407bc3c2f4c9c59f889a190330dd924884de2e7c380a4384f2"} Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.337185 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.462828 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-swift-storage-0\") pod \"50f72ce7-5401-447a-9b93-6761d0db9c6a\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.463240 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-svc\") pod \"50f72ce7-5401-447a-9b93-6761d0db9c6a\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.463330 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-config\") pod \"50f72ce7-5401-447a-9b93-6761d0db9c6a\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.463464 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-sb\") pod 
\"50f72ce7-5401-447a-9b93-6761d0db9c6a\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.463558 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-nb\") pod \"50f72ce7-5401-447a-9b93-6761d0db9c6a\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.463627 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnmpc\" (UniqueName: \"kubernetes.io/projected/50f72ce7-5401-447a-9b93-6761d0db9c6a-kube-api-access-fnmpc\") pod \"50f72ce7-5401-447a-9b93-6761d0db9c6a\" (UID: \"50f72ce7-5401-447a-9b93-6761d0db9c6a\") " Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.484710 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f72ce7-5401-447a-9b93-6761d0db9c6a-kube-api-access-fnmpc" (OuterVolumeSpecName: "kube-api-access-fnmpc") pod "50f72ce7-5401-447a-9b93-6761d0db9c6a" (UID: "50f72ce7-5401-447a-9b93-6761d0db9c6a"). InnerVolumeSpecName "kube-api-access-fnmpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.546482 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50f72ce7-5401-447a-9b93-6761d0db9c6a" (UID: "50f72ce7-5401-447a-9b93-6761d0db9c6a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.549532 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-config" (OuterVolumeSpecName: "config") pod "50f72ce7-5401-447a-9b93-6761d0db9c6a" (UID: "50f72ce7-5401-447a-9b93-6761d0db9c6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.562245 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50f72ce7-5401-447a-9b93-6761d0db9c6a" (UID: "50f72ce7-5401-447a-9b93-6761d0db9c6a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.566977 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.567002 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnmpc\" (UniqueName: \"kubernetes.io/projected/50f72ce7-5401-447a-9b93-6761d0db9c6a-kube-api-access-fnmpc\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.567014 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.567022 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 
07:13:31.569475 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50f72ce7-5401-447a-9b93-6761d0db9c6a" (UID: "50f72ce7-5401-447a-9b93-6761d0db9c6a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.631148 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "50f72ce7-5401-447a-9b93-6761d0db9c6a" (UID: "50f72ce7-5401-447a-9b93-6761d0db9c6a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.668485 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.668514 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50f72ce7-5401-447a-9b93-6761d0db9c6a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.782560 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:13:31 crc kubenswrapper[4941]: I0307 07:13:31.978190 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20d78e1-0e15-49bc-883d-a3d3ce0a1746" path="/var/lib/kubelet/pods/f20d78e1-0e15-49bc-883d-a3d3ce0a1746/volumes" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.140007 4941 generic.go:334] "Generic (PLEG): container finished" podID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" 
containerID="63f5df511d133f8dafff8a42eb1c5eae7f26a1929b39aad4df638313ee7e7378" exitCode=0 Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.140091 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" event={"ID":"ce3f0f8f-0b63-4775-a794-adb5f51cfe66","Type":"ContainerDied","Data":"63f5df511d133f8dafff8a42eb1c5eae7f26a1929b39aad4df638313ee7e7378"} Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.143231 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"753c78f9-47e6-4098-91fa-9adac0997ba4","Type":"ContainerStarted","Data":"ddb6dc5958da3cee1b1aef7e9e6302f3ced067cfdeaa1fa3a76fe919de368e89"} Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.156666 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerStarted","Data":"a6dabd7977184e5a3593081e782e51ca525e584cff685bf0e23886d6b0f8374b"} Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.157236 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.170711 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c65679777-cw55p" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.171535 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665f7c588-lcghc" event={"ID":"68568417-4fd1-4914-8efd-2acbbedc66f9","Type":"ContainerStarted","Data":"d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9"} Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.171583 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665f7c588-lcghc" event={"ID":"68568417-4fd1-4914-8efd-2acbbedc66f9","Type":"ContainerStarted","Data":"8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2"} Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.171620 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.171677 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.183770 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9421772210000001 podStartE2EDuration="7.183755046s" podCreationTimestamp="2026-03-07 07:13:25 +0000 UTC" firstStartedPulling="2026-03-07 07:13:26.183511055 +0000 UTC m=+1303.135876520" lastFinishedPulling="2026-03-07 07:13:31.42508888 +0000 UTC m=+1308.377454345" observedRunningTime="2026-03-07 07:13:32.181335705 +0000 UTC m=+1309.133701170" watchObservedRunningTime="2026-03-07 07:13:32.183755046 +0000 UTC m=+1309.136120511" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.218583 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-665f7c588-lcghc" podStartSLOduration=3.218562449 podStartE2EDuration="3.218562449s" podCreationTimestamp="2026-03-07 07:13:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:32.217922233 +0000 UTC m=+1309.170287698" watchObservedRunningTime="2026-03-07 07:13:32.218562449 +0000 UTC m=+1309.170927914" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.245172 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c65679777-cw55p"] Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.258618 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c65679777-cw55p"] Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.646902 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54977b5b64-bxjq6"] Mar 07 07:13:32 crc kubenswrapper[4941]: E0307 07:13:32.647347 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f72ce7-5401-447a-9b93-6761d0db9c6a" containerName="dnsmasq-dns" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.647371 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f72ce7-5401-447a-9b93-6761d0db9c6a" containerName="dnsmasq-dns" Mar 07 07:13:32 crc kubenswrapper[4941]: E0307 07:13:32.647427 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f72ce7-5401-447a-9b93-6761d0db9c6a" containerName="init" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.647437 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f72ce7-5401-447a-9b93-6761d0db9c6a" containerName="init" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.647694 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f72ce7-5401-447a-9b93-6761d0db9c6a" containerName="dnsmasq-dns" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.648844 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.651713 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.653045 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.661058 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54977b5b64-bxjq6"] Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.708978 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-internal-tls-certs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.709034 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntm2d\" (UniqueName: \"kubernetes.io/projected/757b037d-b7b8-4690-93b9-ec85c5bf82db-kube-api-access-ntm2d\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.709057 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-public-tls-certs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.709082 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-combined-ca-bundle\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.709123 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data-custom\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.709160 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.709195 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757b037d-b7b8-4690-93b9-ec85c5bf82db-logs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.810905 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntm2d\" (UniqueName: \"kubernetes.io/projected/757b037d-b7b8-4690-93b9-ec85c5bf82db-kube-api-access-ntm2d\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.810944 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-public-tls-certs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.810972 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-combined-ca-bundle\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.811019 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data-custom\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.811056 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.811090 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757b037d-b7b8-4690-93b9-ec85c5bf82db-logs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.811178 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-internal-tls-certs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.811713 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757b037d-b7b8-4690-93b9-ec85c5bf82db-logs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.821359 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-public-tls-certs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.821462 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data-custom\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.822847 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-internal-tls-certs\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.828127 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data\") pod 
\"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.832617 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntm2d\" (UniqueName: \"kubernetes.io/projected/757b037d-b7b8-4690-93b9-ec85c5bf82db-kube-api-access-ntm2d\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:32 crc kubenswrapper[4941]: I0307 07:13:32.832901 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-combined-ca-bundle\") pod \"barbican-api-54977b5b64-bxjq6\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:33 crc kubenswrapper[4941]: I0307 07:13:33.000943 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:33 crc kubenswrapper[4941]: I0307 07:13:33.181835 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"753c78f9-47e6-4098-91fa-9adac0997ba4","Type":"ContainerStarted","Data":"36e0c01fd1cc1fab82790d367c1f67d709d44426d74406326bf00d6f4c0369ff"} Mar 07 07:13:33 crc kubenswrapper[4941]: I0307 07:13:33.982683 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f72ce7-5401-447a-9b93-6761d0db9c6a" path="/var/lib/kubelet/pods/50f72ce7-5401-447a-9b93-6761d0db9c6a/volumes" Mar 07 07:13:34 crc kubenswrapper[4941]: I0307 07:13:34.179284 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54977b5b64-bxjq6"] Mar 07 07:13:34 crc kubenswrapper[4941]: W0307 07:13:34.187922 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod757b037d_b7b8_4690_93b9_ec85c5bf82db.slice/crio-652e37556e4bca6529743cd5de087a63bc1a8c0e445000eab404d433d4401493 WatchSource:0}: Error finding container 652e37556e4bca6529743cd5de087a63bc1a8c0e445000eab404d433d4401493: Status 404 returned error can't find the container with id 652e37556e4bca6529743cd5de087a63bc1a8c0e445000eab404d433d4401493 Mar 07 07:13:34 crc kubenswrapper[4941]: I0307 07:13:34.212998 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" event={"ID":"ce3f0f8f-0b63-4775-a794-adb5f51cfe66","Type":"ContainerStarted","Data":"32c5917b694cba85411826ec18fe5346f9c4d4e9441c2f979e5480f971e8e488"} Mar 07 07:13:34 crc kubenswrapper[4941]: I0307 07:13:34.214176 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:34 crc kubenswrapper[4941]: I0307 07:13:34.221288 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57855ff457-mshjt" 
event={"ID":"e27683db-592f-485a-93b3-93273e1644c3","Type":"ContainerStarted","Data":"5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02"} Mar 07 07:13:34 crc kubenswrapper[4941]: I0307 07:13:34.223125 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" event={"ID":"ea4583b7-29d7-466d-8c3d-ad9981ebc66d","Type":"ContainerStarted","Data":"3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3"} Mar 07 07:13:34 crc kubenswrapper[4941]: I0307 07:13:34.239725 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" podStartSLOduration=5.239710178 podStartE2EDuration="5.239710178s" podCreationTimestamp="2026-03-07 07:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:34.233649654 +0000 UTC m=+1311.186015119" watchObservedRunningTime="2026-03-07 07:13:34.239710178 +0000 UTC m=+1311.192075643" Mar 07 07:13:34 crc kubenswrapper[4941]: I0307 07:13:34.730069 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 07 07:13:34 crc kubenswrapper[4941]: I0307 07:13:34.817658 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.232340 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"753c78f9-47e6-4098-91fa-9adac0997ba4","Type":"ContainerStarted","Data":"7309868f4caab95c79325c4137c9791aaa3b778c28a0d6e39b6d6ff175e4b90e"} Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.232563 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.235477 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-57855ff457-mshjt" event={"ID":"e27683db-592f-485a-93b3-93273e1644c3","Type":"ContainerStarted","Data":"1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3"} Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.237997 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" event={"ID":"ea4583b7-29d7-466d-8c3d-ad9981ebc66d","Type":"ContainerStarted","Data":"22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380"} Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.241040 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54977b5b64-bxjq6" event={"ID":"757b037d-b7b8-4690-93b9-ec85c5bf82db","Type":"ContainerStarted","Data":"9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8"} Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.241094 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54977b5b64-bxjq6" event={"ID":"757b037d-b7b8-4690-93b9-ec85c5bf82db","Type":"ContainerStarted","Data":"68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3"} Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.241108 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54977b5b64-bxjq6" event={"ID":"757b037d-b7b8-4690-93b9-ec85c5bf82db","Type":"ContainerStarted","Data":"652e37556e4bca6529743cd5de087a63bc1a8c0e445000eab404d433d4401493"} Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.241420 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerName="cinder-scheduler" containerID="cri-o://abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895" gracePeriod=30 Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.241441 4941 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerName="probe" containerID="cri-o://55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a" gracePeriod=30 Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.255066 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.255046878 podStartE2EDuration="5.255046878s" podCreationTimestamp="2026-03-07 07:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:35.254846503 +0000 UTC m=+1312.207211968" watchObservedRunningTime="2026-03-07 07:13:35.255046878 +0000 UTC m=+1312.207412343" Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.279610 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" podStartSLOduration=2.956478706 podStartE2EDuration="6.279595521s" podCreationTimestamp="2026-03-07 07:13:29 +0000 UTC" firstStartedPulling="2026-03-07 07:13:30.289763444 +0000 UTC m=+1307.242128909" lastFinishedPulling="2026-03-07 07:13:33.612880249 +0000 UTC m=+1310.565245724" observedRunningTime="2026-03-07 07:13:35.270978822 +0000 UTC m=+1312.223344287" watchObservedRunningTime="2026-03-07 07:13:35.279595521 +0000 UTC m=+1312.231960976" Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.293149 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-57855ff457-mshjt" podStartSLOduration=2.916321937 podStartE2EDuration="6.293131275s" podCreationTimestamp="2026-03-07 07:13:29 +0000 UTC" firstStartedPulling="2026-03-07 07:13:30.213143379 +0000 UTC m=+1307.165508834" lastFinishedPulling="2026-03-07 07:13:33.589952687 +0000 UTC m=+1310.542318172" observedRunningTime="2026-03-07 07:13:35.287769118 +0000 UTC m=+1312.240134593" watchObservedRunningTime="2026-03-07 
07:13:35.293131275 +0000 UTC m=+1312.245496740" Mar 07 07:13:35 crc kubenswrapper[4941]: I0307 07:13:35.314250 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54977b5b64-bxjq6" podStartSLOduration=3.31423085 podStartE2EDuration="3.31423085s" podCreationTimestamp="2026-03-07 07:13:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:35.3063508 +0000 UTC m=+1312.258716275" watchObservedRunningTime="2026-03-07 07:13:35.31423085 +0000 UTC m=+1312.266596315" Mar 07 07:13:36 crc kubenswrapper[4941]: I0307 07:13:36.253702 4941 generic.go:334] "Generic (PLEG): container finished" podID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerID="55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a" exitCode=0 Mar 07 07:13:36 crc kubenswrapper[4941]: I0307 07:13:36.253783 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17554198-2cb0-40a6-9cb0-bc7d4100db11","Type":"ContainerDied","Data":"55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a"} Mar 07 07:13:36 crc kubenswrapper[4941]: I0307 07:13:36.256041 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:36 crc kubenswrapper[4941]: I0307 07:13:36.256225 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:38 crc kubenswrapper[4941]: I0307 07:13:38.955050 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.052947 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv2bv\" (UniqueName: \"kubernetes.io/projected/17554198-2cb0-40a6-9cb0-bc7d4100db11-kube-api-access-mv2bv\") pod \"17554198-2cb0-40a6-9cb0-bc7d4100db11\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.053047 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17554198-2cb0-40a6-9cb0-bc7d4100db11-etc-machine-id\") pod \"17554198-2cb0-40a6-9cb0-bc7d4100db11\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.053132 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data\") pod \"17554198-2cb0-40a6-9cb0-bc7d4100db11\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.053173 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-scripts\") pod \"17554198-2cb0-40a6-9cb0-bc7d4100db11\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.053201 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data-custom\") pod \"17554198-2cb0-40a6-9cb0-bc7d4100db11\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.053219 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-combined-ca-bundle\") pod \"17554198-2cb0-40a6-9cb0-bc7d4100db11\" (UID: \"17554198-2cb0-40a6-9cb0-bc7d4100db11\") " Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.053215 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17554198-2cb0-40a6-9cb0-bc7d4100db11-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "17554198-2cb0-40a6-9cb0-bc7d4100db11" (UID: "17554198-2cb0-40a6-9cb0-bc7d4100db11"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.053696 4941 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17554198-2cb0-40a6-9cb0-bc7d4100db11-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.060983 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17554198-2cb0-40a6-9cb0-bc7d4100db11-kube-api-access-mv2bv" (OuterVolumeSpecName: "kube-api-access-mv2bv") pod "17554198-2cb0-40a6-9cb0-bc7d4100db11" (UID: "17554198-2cb0-40a6-9cb0-bc7d4100db11"). InnerVolumeSpecName "kube-api-access-mv2bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.065554 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "17554198-2cb0-40a6-9cb0-bc7d4100db11" (UID: "17554198-2cb0-40a6-9cb0-bc7d4100db11"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.069563 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-scripts" (OuterVolumeSpecName: "scripts") pod "17554198-2cb0-40a6-9cb0-bc7d4100db11" (UID: "17554198-2cb0-40a6-9cb0-bc7d4100db11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.155609 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv2bv\" (UniqueName: \"kubernetes.io/projected/17554198-2cb0-40a6-9cb0-bc7d4100db11-kube-api-access-mv2bv\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.155649 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.155665 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.217149 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17554198-2cb0-40a6-9cb0-bc7d4100db11" (UID: "17554198-2cb0-40a6-9cb0-bc7d4100db11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.246591 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data" (OuterVolumeSpecName: "config-data") pod "17554198-2cb0-40a6-9cb0-bc7d4100db11" (UID: "17554198-2cb0-40a6-9cb0-bc7d4100db11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.258032 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.258067 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17554198-2cb0-40a6-9cb0-bc7d4100db11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.284782 4941 generic.go:334] "Generic (PLEG): container finished" podID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerID="abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895" exitCode=0 Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.284833 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17554198-2cb0-40a6-9cb0-bc7d4100db11","Type":"ContainerDied","Data":"abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895"} Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.284836 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.284876 4941 scope.go:117] "RemoveContainer" containerID="55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.284864 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"17554198-2cb0-40a6-9cb0-bc7d4100db11","Type":"ContainerDied","Data":"f332dbbe105ff5b62574881287cc8c3dc525bb92de9252b0726a076d7bae3dca"} Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.325274 4941 scope.go:117] "RemoveContainer" containerID="abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.330849 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.346105 4941 scope.go:117] "RemoveContainer" containerID="55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a" Mar 07 07:13:39 crc kubenswrapper[4941]: E0307 07:13:39.347571 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a\": container with ID starting with 55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a not found: ID does not exist" containerID="55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.347620 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a"} err="failed to get container status \"55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a\": rpc error: code = NotFound desc = could not find container \"55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a\": container with 
ID starting with 55cfd6edb738e3a443de2b2d93aff7eec87f9d6b3bb5f1c6484f8ab4999bb04a not found: ID does not exist" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.347652 4941 scope.go:117] "RemoveContainer" containerID="abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.347763 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:13:39 crc kubenswrapper[4941]: E0307 07:13:39.347981 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895\": container with ID starting with abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895 not found: ID does not exist" containerID="abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.348022 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895"} err="failed to get container status \"abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895\": rpc error: code = NotFound desc = could not find container \"abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895\": container with ID starting with abf9e05262fd9eec016fbcb29868135f679a4f2b131103dd36bb8ff0c2c83895 not found: ID does not exist" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.355635 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:13:39 crc kubenswrapper[4941]: E0307 07:13:39.355976 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerName="cinder-scheduler" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.355994 4941 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerName="cinder-scheduler" Mar 07 07:13:39 crc kubenswrapper[4941]: E0307 07:13:39.356028 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerName="probe" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.356036 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerName="probe" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.357154 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerName="probe" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.357188 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" containerName="cinder-scheduler" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.358060 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.362444 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.390384 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.498025 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-scripts\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.498192 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlxkt\" (UniqueName: 
\"kubernetes.io/projected/317acc48-d39a-4c99-8a4e-ef91b0fc3894-kube-api-access-nlxkt\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.498246 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.498322 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.498531 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/317acc48-d39a-4c99-8a4e-ef91b0fc3894-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.498614 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.600104 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.600224 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlxkt\" (UniqueName: \"kubernetes.io/projected/317acc48-d39a-4c99-8a4e-ef91b0fc3894-kube-api-access-nlxkt\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.600265 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.600334 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.600366 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/317acc48-d39a-4c99-8a4e-ef91b0fc3894-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.600421 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc 
kubenswrapper[4941]: I0307 07:13:39.600855 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/317acc48-d39a-4c99-8a4e-ef91b0fc3894-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.604142 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-scripts\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.604222 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.604629 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.609574 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.619748 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlxkt\" (UniqueName: 
\"kubernetes.io/projected/317acc48-d39a-4c99-8a4e-ef91b0fc3894-kube-api-access-nlxkt\") pod \"cinder-scheduler-0\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.687548 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.815668 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.903870 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-phlcm"] Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.904101 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" podUID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" containerName="dnsmasq-dns" containerID="cri-o://2c036da20ddd296adfeb420ce25d33ff31e9ffd433a28c985963ac40bec54e58" gracePeriod=10 Mar 07 07:13:39 crc kubenswrapper[4941]: I0307 07:13:39.970778 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17554198-2cb0-40a6-9cb0-bc7d4100db11" path="/var/lib/kubelet/pods/17554198-2cb0-40a6-9cb0-bc7d4100db11/volumes" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.263187 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.322935 4941 generic.go:334] "Generic (PLEG): container finished" podID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" containerID="2c036da20ddd296adfeb420ce25d33ff31e9ffd433a28c985963ac40bec54e58" exitCode=0 Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.323010 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" 
event={"ID":"4222d03b-3493-40c9-81e4-9818cd6e6cbf","Type":"ContainerDied","Data":"2c036da20ddd296adfeb420ce25d33ff31e9ffd433a28c985963ac40bec54e58"} Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.325195 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"317acc48-d39a-4c99-8a4e-ef91b0fc3894","Type":"ContainerStarted","Data":"9d00ca63bc8b899319b7b54870f0e72ce188d9a890bf5cc3b4845079d5e44aa7"} Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.402381 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.513734 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-config\") pod \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.513793 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-swift-storage-0\") pod \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.513810 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-sb\") pod \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.513908 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4zrx\" (UniqueName: \"kubernetes.io/projected/4222d03b-3493-40c9-81e4-9818cd6e6cbf-kube-api-access-w4zrx\") pod 
\"4222d03b-3493-40c9-81e4-9818cd6e6cbf\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.513932 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-svc\") pod \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.513962 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-nb\") pod \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\" (UID: \"4222d03b-3493-40c9-81e4-9818cd6e6cbf\") " Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.539907 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4222d03b-3493-40c9-81e4-9818cd6e6cbf-kube-api-access-w4zrx" (OuterVolumeSpecName: "kube-api-access-w4zrx") pod "4222d03b-3493-40c9-81e4-9818cd6e6cbf" (UID: "4222d03b-3493-40c9-81e4-9818cd6e6cbf"). InnerVolumeSpecName "kube-api-access-w4zrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.595448 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4222d03b-3493-40c9-81e4-9818cd6e6cbf" (UID: "4222d03b-3493-40c9-81e4-9818cd6e6cbf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.614058 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4222d03b-3493-40c9-81e4-9818cd6e6cbf" (UID: "4222d03b-3493-40c9-81e4-9818cd6e6cbf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.616253 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4zrx\" (UniqueName: \"kubernetes.io/projected/4222d03b-3493-40c9-81e4-9818cd6e6cbf-kube-api-access-w4zrx\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.616279 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.616292 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.656787 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4222d03b-3493-40c9-81e4-9818cd6e6cbf" (UID: "4222d03b-3493-40c9-81e4-9818cd6e6cbf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.660603 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-config" (OuterVolumeSpecName: "config") pod "4222d03b-3493-40c9-81e4-9818cd6e6cbf" (UID: "4222d03b-3493-40c9-81e4-9818cd6e6cbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.667378 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4222d03b-3493-40c9-81e4-9818cd6e6cbf" (UID: "4222d03b-3493-40c9-81e4-9818cd6e6cbf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.722529 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.722560 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:40 crc kubenswrapper[4941]: I0307 07:13:40.722571 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4222d03b-3493-40c9-81e4-9818cd6e6cbf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.047831 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-576964b458-llgdq" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.158:9696/\": dial 
tcp 10.217.0.158:9696: connect: connection refused" Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.264110 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.346984 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.347018 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-phlcm" event={"ID":"4222d03b-3493-40c9-81e4-9818cd6e6cbf","Type":"ContainerDied","Data":"91745c1f74e5d09a6ff41647f0c937d22392d0322a9291b89e8627657acfd4cf"} Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.347069 4941 scope.go:117] "RemoveContainer" containerID="2c036da20ddd296adfeb420ce25d33ff31e9ffd433a28c985963ac40bec54e58" Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.356349 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"317acc48-d39a-4c99-8a4e-ef91b0fc3894","Type":"ContainerStarted","Data":"1a0ea8a8f3c822cadf12d5e4208a3d4ccded7ff8311a82b01edce4e26bce47c2"} Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.399808 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-phlcm"] Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.403337 4941 scope.go:117] "RemoveContainer" containerID="6b5ee8c80bd049d6e322bcde1bfdb534eae6ba5a0aa7b5f2235d9ef66ee8788d" Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.410456 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-phlcm"] Mar 07 07:13:41 crc kubenswrapper[4941]: I0307 07:13:41.966728 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" path="/var/lib/kubelet/pods/4222d03b-3493-40c9-81e4-9818cd6e6cbf/volumes" Mar 07 07:13:41 crc 
kubenswrapper[4941]: I0307 07:13:41.997498 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:42 crc kubenswrapper[4941]: I0307 07:13:42.111960 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:42 crc kubenswrapper[4941]: I0307 07:13:42.365508 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"317acc48-d39a-4c99-8a4e-ef91b0fc3894","Type":"ContainerStarted","Data":"4e90b5427d15f1e301d7820993316b94e70b1e5e57e33af40b7531f4506658b7"} Mar 07 07:13:42 crc kubenswrapper[4941]: I0307 07:13:42.391279 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.391254701 podStartE2EDuration="3.391254701s" podCreationTimestamp="2026-03-07 07:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:42.387217019 +0000 UTC m=+1319.339582484" watchObservedRunningTime="2026-03-07 07:13:42.391254701 +0000 UTC m=+1319.343620166" Mar 07 07:13:42 crc kubenswrapper[4941]: I0307 07:13:42.917262 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:13:43 crc kubenswrapper[4941]: I0307 07:13:43.871260 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:13:43 crc kubenswrapper[4941]: I0307 07:13:43.942458 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57fd5df48d-tt655"] Mar 07 07:13:43 crc kubenswrapper[4941]: I0307 07:13:43.942662 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57fd5df48d-tt655" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerName="neutron-api" 
containerID="cri-o://3629b3f9473c21dc1f60d85a6d35e9abe12bf665705c1e70e6d78451d7d23a5f" gracePeriod=30 Mar 07 07:13:43 crc kubenswrapper[4941]: I0307 07:13:43.943021 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57fd5df48d-tt655" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerName="neutron-httpd" containerID="cri-o://cfbe514f4242e0ae0b9007e0d67c894bbd9f91c60d5515850b52c12bba198d84" gracePeriod=30 Mar 07 07:13:44 crc kubenswrapper[4941]: I0307 07:13:44.404351 4941 generic.go:334] "Generic (PLEG): container finished" podID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerID="cfbe514f4242e0ae0b9007e0d67c894bbd9f91c60d5515850b52c12bba198d84" exitCode=0 Mar 07 07:13:44 crc kubenswrapper[4941]: I0307 07:13:44.404483 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57fd5df48d-tt655" event={"ID":"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b","Type":"ContainerDied","Data":"cfbe514f4242e0ae0b9007e0d67c894bbd9f91c60d5515850b52c12bba198d84"} Mar 07 07:13:44 crc kubenswrapper[4941]: I0307 07:13:44.612711 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 07 07:13:44 crc kubenswrapper[4941]: I0307 07:13:44.688711 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 07 07:13:44 crc kubenswrapper[4941]: I0307 07:13:44.947295 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:44 crc kubenswrapper[4941]: I0307 07:13:44.960978 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.070743 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-665f7c588-lcghc"] Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.070960 4941 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/barbican-api-665f7c588-lcghc" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api-log" containerID="cri-o://8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2" gracePeriod=30 Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.071326 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-665f7c588-lcghc" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api" containerID="cri-o://d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9" gracePeriod=30 Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.078053 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-665f7c588-lcghc" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": EOF" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.374782 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-576964b458-llgdq_1c05d887-e05c-4593-a5ad-76be76a9e637/neutron-api/0.log" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.375106 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.432643 4941 generic.go:334] "Generic (PLEG): container finished" podID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerID="8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2" exitCode=143 Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.432722 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665f7c588-lcghc" event={"ID":"68568417-4fd1-4914-8efd-2acbbedc66f9","Type":"ContainerDied","Data":"8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2"} Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.435559 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-576964b458-llgdq_1c05d887-e05c-4593-a5ad-76be76a9e637/neutron-api/0.log" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.435601 4941 generic.go:334] "Generic (PLEG): container finished" podID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerID="a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2" exitCode=137 Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.436642 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-576964b458-llgdq" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.437210 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576964b458-llgdq" event={"ID":"1c05d887-e05c-4593-a5ad-76be76a9e637","Type":"ContainerDied","Data":"a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2"} Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.437246 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576964b458-llgdq" event={"ID":"1c05d887-e05c-4593-a5ad-76be76a9e637","Type":"ContainerDied","Data":"ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3"} Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.437296 4941 scope.go:117] "RemoveContainer" containerID="2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.470966 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-config\") pod \"1c05d887-e05c-4593-a5ad-76be76a9e637\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.471784 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5blc\" (UniqueName: \"kubernetes.io/projected/1c05d887-e05c-4593-a5ad-76be76a9e637-kube-api-access-t5blc\") pod \"1c05d887-e05c-4593-a5ad-76be76a9e637\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.471889 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-httpd-config\") pod \"1c05d887-e05c-4593-a5ad-76be76a9e637\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.471989 4941 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-ovndb-tls-certs\") pod \"1c05d887-e05c-4593-a5ad-76be76a9e637\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.472245 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-combined-ca-bundle\") pod \"1c05d887-e05c-4593-a5ad-76be76a9e637\" (UID: \"1c05d887-e05c-4593-a5ad-76be76a9e637\") " Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.477409 4941 scope.go:117] "RemoveContainer" containerID="a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.478599 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1c05d887-e05c-4593-a5ad-76be76a9e637" (UID: "1c05d887-e05c-4593-a5ad-76be76a9e637"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.479257 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c05d887-e05c-4593-a5ad-76be76a9e637-kube-api-access-t5blc" (OuterVolumeSpecName: "kube-api-access-t5blc") pod "1c05d887-e05c-4593-a5ad-76be76a9e637" (UID: "1c05d887-e05c-4593-a5ad-76be76a9e637"). InnerVolumeSpecName "kube-api-access-t5blc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.573199 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c05d887-e05c-4593-a5ad-76be76a9e637" (UID: "1c05d887-e05c-4593-a5ad-76be76a9e637"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.574527 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.574618 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5blc\" (UniqueName: \"kubernetes.io/projected/1c05d887-e05c-4593-a5ad-76be76a9e637-kube-api-access-t5blc\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.574675 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.598117 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1c05d887-e05c-4593-a5ad-76be76a9e637" (UID: "1c05d887-e05c-4593-a5ad-76be76a9e637"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.598395 4941 scope.go:117] "RemoveContainer" containerID="2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2" Mar 07 07:13:45 crc kubenswrapper[4941]: E0307 07:13:45.601759 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2\": container with ID starting with 2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2 not found: ID does not exist" containerID="2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.601820 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2"} err="failed to get container status \"2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2\": rpc error: code = NotFound desc = could not find container \"2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2\": container with ID starting with 2e9cb85c3d64ffb2cf0a7810e64b48c8b96c008de30eedddc48551d9400e3ad2 not found: ID does not exist" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.601887 4941 scope.go:117] "RemoveContainer" containerID="a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2" Mar 07 07:13:45 crc kubenswrapper[4941]: E0307 07:13:45.602535 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2\": container with ID starting with a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2 not found: ID does not exist" containerID="a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.607421 
4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2"} err="failed to get container status \"a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2\": rpc error: code = NotFound desc = could not find container \"a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2\": container with ID starting with a8a35b3deb5b1cf7ccde666a6775f2db92cfff20287912f391d7747193bdc3a2 not found: ID does not exist" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.639495 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-config" (OuterVolumeSpecName: "config") pod "1c05d887-e05c-4593-a5ad-76be76a9e637" (UID: "1c05d887-e05c-4593-a5ad-76be76a9e637"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.676718 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.676979 4941 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c05d887-e05c-4593-a5ad-76be76a9e637-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.790863 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-576964b458-llgdq"] Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.804880 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-576964b458-llgdq"] Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.872523 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 07 07:13:45 crc kubenswrapper[4941]: E0307 07:13:45.872932 4941 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" containerName="dnsmasq-dns" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.872948 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" containerName="dnsmasq-dns" Mar 07 07:13:45 crc kubenswrapper[4941]: E0307 07:13:45.872959 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" containerName="init" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.872965 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" containerName="init" Mar 07 07:13:45 crc kubenswrapper[4941]: E0307 07:13:45.872974 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-httpd" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.872981 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-httpd" Mar 07 07:13:45 crc kubenswrapper[4941]: E0307 07:13:45.872990 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-api" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.872995 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-api" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.873184 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="4222d03b-3493-40c9-81e4-9818cd6e6cbf" containerName="dnsmasq-dns" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.873195 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-httpd" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.873206 4941 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" containerName="neutron-api" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.873808 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.882257 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.883005 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8dg8g" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.883104 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.883030 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.964836 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c05d887-e05c-4593-a5ad-76be76a9e637" path="/var/lib/kubelet/pods/1c05d887-e05c-4593-a5ad-76be76a9e637/volumes" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.984991 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.985124 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv7c\" (UniqueName: \"kubernetes.io/projected/b030b241-21f3-48a4-88de-c63abeddccb1-kube-api-access-dwv7c\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:45 crc kubenswrapper[4941]: 
I0307 07:13:45.985183 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config-secret\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:45 crc kubenswrapper[4941]: I0307 07:13:45.985465 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.087596 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.087714 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwv7c\" (UniqueName: \"kubernetes.io/projected/b030b241-21f3-48a4-88de-c63abeddccb1-kube-api-access-dwv7c\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.087761 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config-secret\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.087885 4941 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.088820 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.094526 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.095262 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config-secret\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.103267 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwv7c\" (UniqueName: \"kubernetes.io/projected/b030b241-21f3-48a4-88de-c63abeddccb1-kube-api-access-dwv7c\") pod \"openstackclient\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.193816 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 07 07:13:46 crc kubenswrapper[4941]: I0307 07:13:46.678717 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 07 07:13:47 crc kubenswrapper[4941]: I0307 07:13:47.460533 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b030b241-21f3-48a4-88de-c63abeddccb1","Type":"ContainerStarted","Data":"bf9fc5adff6a12c2cc02a46e6a6d97668a7056c45aff2c6fb85fdd468fd5b459"} Mar 07 07:13:48 crc kubenswrapper[4941]: I0307 07:13:48.506235 4941 generic.go:334] "Generic (PLEG): container finished" podID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerID="3629b3f9473c21dc1f60d85a6d35e9abe12bf665705c1e70e6d78451d7d23a5f" exitCode=0 Mar 07 07:13:48 crc kubenswrapper[4941]: I0307 07:13:48.506536 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57fd5df48d-tt655" event={"ID":"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b","Type":"ContainerDied","Data":"3629b3f9473c21dc1f60d85a6d35e9abe12bf665705c1e70e6d78451d7d23a5f"} Mar 07 07:13:48 crc kubenswrapper[4941]: I0307 07:13:48.930103 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.048657 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-combined-ca-bundle\") pod \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.048974 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggvsr\" (UniqueName: \"kubernetes.io/projected/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-kube-api-access-ggvsr\") pod \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.049163 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-httpd-config\") pod \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.049187 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-config\") pod \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.049250 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-ovndb-tls-certs\") pod \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\" (UID: \"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b\") " Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.069641 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" (UID: "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.073516 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-kube-api-access-ggvsr" (OuterVolumeSpecName: "kube-api-access-ggvsr") pod "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" (UID: "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b"). InnerVolumeSpecName "kube-api-access-ggvsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.109605 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-config" (OuterVolumeSpecName: "config") pod "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" (UID: "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.134650 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" (UID: "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.148429 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" (UID: "b3e4d16b-185a-49c5-b246-a0ed7b0efe9b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.154667 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.154706 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.154715 4941 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.154749 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.154758 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggvsr\" (UniqueName: \"kubernetes.io/projected/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b-kube-api-access-ggvsr\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.516126 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57fd5df48d-tt655" event={"ID":"b3e4d16b-185a-49c5-b246-a0ed7b0efe9b","Type":"ContainerDied","Data":"7903c5ea1653c603598d3a936763f53b1b537f74d13149d9ca44f761e5264d30"} Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.516175 4941 scope.go:117] "RemoveContainer" containerID="cfbe514f4242e0ae0b9007e0d67c894bbd9f91c60d5515850b52c12bba198d84" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.516231 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57fd5df48d-tt655" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.556839 4941 scope.go:117] "RemoveContainer" containerID="3629b3f9473c21dc1f60d85a6d35e9abe12bf665705c1e70e6d78451d7d23a5f" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.559823 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57fd5df48d-tt655"] Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.567307 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-57fd5df48d-tt655"] Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.917988 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 07 07:13:49 crc kubenswrapper[4941]: I0307 07:13:49.970283 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" path="/var/lib/kubelet/pods/b3e4d16b-185a-49c5-b246-a0ed7b0efe9b/volumes" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.462199 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5598777fd7-9fgcl"] Mar 07 07:13:50 crc kubenswrapper[4941]: E0307 07:13:50.462576 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerName="neutron-api" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.462594 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerName="neutron-api" Mar 07 07:13:50 crc kubenswrapper[4941]: E0307 07:13:50.462611 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerName="neutron-httpd" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.462620 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerName="neutron-httpd" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.462777 4941 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerName="neutron-api" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.462807 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e4d16b-185a-49c5-b246-a0ed7b0efe9b" containerName="neutron-httpd" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.463638 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.477369 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.477742 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.478142 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.486101 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5598777fd7-9fgcl"] Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.577536 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8t2k\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-kube-api-access-x8t2k\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.577604 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-etc-swift\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " 
pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.577628 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-combined-ca-bundle\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.577657 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-config-data\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.577676 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-public-tls-certs\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.577699 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-internal-tls-certs\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.577729 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-log-httpd\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: 
\"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.577777 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-run-httpd\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.660173 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-665f7c588-lcghc" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:59136->10.217.0.169:9311: read: connection reset by peer" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.660230 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-665f7c588-lcghc" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:59126->10.217.0.169:9311: read: connection reset by peer" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.679680 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-internal-tls-certs\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.679959 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-log-httpd\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: 
\"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.680032 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-run-httpd\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.680099 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8t2k\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-kube-api-access-x8t2k\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.680138 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-etc-swift\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.680160 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-combined-ca-bundle\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.680191 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-config-data\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " 
pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.680214 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-public-tls-certs\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.684812 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-internal-tls-certs\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.685088 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-log-httpd\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.685113 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-public-tls-certs\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.685293 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-run-httpd\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 
07:13:50.691351 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-etc-swift\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.697923 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8t2k\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-kube-api-access-x8t2k\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.701756 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-combined-ca-bundle\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.702468 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-config-data\") pod \"swift-proxy-5598777fd7-9fgcl\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") " pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:50 crc kubenswrapper[4941]: I0307 07:13:50.780327 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.153047 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.291794 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-combined-ca-bundle\") pod \"68568417-4fd1-4914-8efd-2acbbedc66f9\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.291866 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data-custom\") pod \"68568417-4fd1-4914-8efd-2acbbedc66f9\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.291920 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data\") pod \"68568417-4fd1-4914-8efd-2acbbedc66f9\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.292006 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68568417-4fd1-4914-8efd-2acbbedc66f9-logs\") pod \"68568417-4fd1-4914-8efd-2acbbedc66f9\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.292048 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb79q\" (UniqueName: \"kubernetes.io/projected/68568417-4fd1-4914-8efd-2acbbedc66f9-kube-api-access-cb79q\") pod \"68568417-4fd1-4914-8efd-2acbbedc66f9\" (UID: \"68568417-4fd1-4914-8efd-2acbbedc66f9\") " Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.294129 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/68568417-4fd1-4914-8efd-2acbbedc66f9-logs" (OuterVolumeSpecName: "logs") pod "68568417-4fd1-4914-8efd-2acbbedc66f9" (UID: "68568417-4fd1-4914-8efd-2acbbedc66f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.297476 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68568417-4fd1-4914-8efd-2acbbedc66f9-kube-api-access-cb79q" (OuterVolumeSpecName: "kube-api-access-cb79q") pod "68568417-4fd1-4914-8efd-2acbbedc66f9" (UID: "68568417-4fd1-4914-8efd-2acbbedc66f9"). InnerVolumeSpecName "kube-api-access-cb79q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.297973 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "68568417-4fd1-4914-8efd-2acbbedc66f9" (UID: "68568417-4fd1-4914-8efd-2acbbedc66f9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.326745 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68568417-4fd1-4914-8efd-2acbbedc66f9" (UID: "68568417-4fd1-4914-8efd-2acbbedc66f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.354898 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data" (OuterVolumeSpecName: "config-data") pod "68568417-4fd1-4914-8efd-2acbbedc66f9" (UID: "68568417-4fd1-4914-8efd-2acbbedc66f9"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.394572 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.394602 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.394612 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68568417-4fd1-4914-8efd-2acbbedc66f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.394620 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68568417-4fd1-4914-8efd-2acbbedc66f9-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.394628 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb79q\" (UniqueName: \"kubernetes.io/projected/68568417-4fd1-4914-8efd-2acbbedc66f9-kube-api-access-cb79q\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.430645 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5598777fd7-9fgcl"] Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.587022 4941 generic.go:334] "Generic (PLEG): container finished" podID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerID="d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9" exitCode=0 Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.587347 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-665f7c588-lcghc" event={"ID":"68568417-4fd1-4914-8efd-2acbbedc66f9","Type":"ContainerDied","Data":"d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9"} Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.587371 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-665f7c588-lcghc" event={"ID":"68568417-4fd1-4914-8efd-2acbbedc66f9","Type":"ContainerDied","Data":"214f1f5cca2bbf407bc3c2f4c9c59f889a190330dd924884de2e7c380a4384f2"} Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.587467 4941 scope.go:117] "RemoveContainer" containerID="d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.587635 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-665f7c588-lcghc" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.650454 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-665f7c588-lcghc"] Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.666840 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-665f7c588-lcghc"] Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.835059 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.835832 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="proxy-httpd" containerID="cri-o://a6dabd7977184e5a3593081e782e51ca525e584cff685bf0e23886d6b0f8374b" gracePeriod=30 Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.835955 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="sg-core" 
containerID="cri-o://5f1b9e84f63b5ec6fa685f6c77e6d290c2fd261231331c2de886282c82884767" gracePeriod=30 Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.836138 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="ceilometer-notification-agent" containerID="cri-o://60317e62b500b74652eac982fea934a674e730ed4d2b573d4ef18c1146ab1cbd" gracePeriod=30 Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.837113 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="ceilometer-central-agent" containerID="cri-o://65a4666ce8e68c7938a912563839662d2d2bdd66f54dbd2cdd96614dc19f52d8" gracePeriod=30 Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.846397 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 07 07:13:51 crc kubenswrapper[4941]: I0307 07:13:51.969797 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" path="/var/lib/kubelet/pods/68568417-4fd1-4914-8efd-2acbbedc66f9/volumes" Mar 07 07:13:52 crc kubenswrapper[4941]: I0307 07:13:52.601352 4941 generic.go:334] "Generic (PLEG): container finished" podID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerID="a6dabd7977184e5a3593081e782e51ca525e584cff685bf0e23886d6b0f8374b" exitCode=0 Mar 07 07:13:52 crc kubenswrapper[4941]: I0307 07:13:52.601385 4941 generic.go:334] "Generic (PLEG): container finished" podID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerID="5f1b9e84f63b5ec6fa685f6c77e6d290c2fd261231331c2de886282c82884767" exitCode=2 Mar 07 07:13:52 crc kubenswrapper[4941]: I0307 07:13:52.601394 4941 generic.go:334] "Generic (PLEG): container finished" podID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerID="60317e62b500b74652eac982fea934a674e730ed4d2b573d4ef18c1146ab1cbd" 
exitCode=0 Mar 07 07:13:52 crc kubenswrapper[4941]: I0307 07:13:52.601415 4941 generic.go:334] "Generic (PLEG): container finished" podID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerID="65a4666ce8e68c7938a912563839662d2d2bdd66f54dbd2cdd96614dc19f52d8" exitCode=0 Mar 07 07:13:52 crc kubenswrapper[4941]: I0307 07:13:52.601453 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerDied","Data":"a6dabd7977184e5a3593081e782e51ca525e584cff685bf0e23886d6b0f8374b"} Mar 07 07:13:52 crc kubenswrapper[4941]: I0307 07:13:52.601507 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerDied","Data":"5f1b9e84f63b5ec6fa685f6c77e6d290c2fd261231331c2de886282c82884767"} Mar 07 07:13:52 crc kubenswrapper[4941]: I0307 07:13:52.601521 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerDied","Data":"60317e62b500b74652eac982fea934a674e730ed4d2b573d4ef18c1146ab1cbd"} Mar 07 07:13:52 crc kubenswrapper[4941]: I0307 07:13:52.601532 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerDied","Data":"65a4666ce8e68c7938a912563839662d2d2bdd66f54dbd2cdd96614dc19f52d8"} Mar 07 07:13:53 crc kubenswrapper[4941]: E0307 07:13:53.781455 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c05d887_e05c_4593_a5ad_76be76a9e637.slice/crio-ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3\": RecentStats: unable to find data in memory cache]" Mar 07 07:13:55 crc kubenswrapper[4941]: I0307 07:13:55.487970 4941 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ceilometer-0" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.165:3000/\": dial tcp 10.217.0.165:3000: connect: connection refused" Mar 07 07:13:56 crc kubenswrapper[4941]: I0307 07:13:56.455374 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-685ff95674-ldzd4" Mar 07 07:13:56 crc kubenswrapper[4941]: I0307 07:13:56.483479 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-685ff95674-ldzd4" Mar 07 07:13:57 crc kubenswrapper[4941]: W0307 07:13:57.553532 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b2f75a4_a46a_4430_bf4d_d3c2c65d8510.slice/crio-bf075523b0769cc58db4e1637f1845844c69fa2ab2dd6316cb6667f9da692f93 WatchSource:0}: Error finding container bf075523b0769cc58db4e1637f1845844c69fa2ab2dd6316cb6667f9da692f93: Status 404 returned error can't find the container with id bf075523b0769cc58db4e1637f1845844c69fa2ab2dd6316cb6667f9da692f93 Mar 07 07:13:57 crc kubenswrapper[4941]: I0307 07:13:57.621731 4941 scope.go:117] "RemoveContainer" containerID="8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2" Mar 07 07:13:57 crc kubenswrapper[4941]: I0307 07:13:57.671999 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598777fd7-9fgcl" event={"ID":"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510","Type":"ContainerStarted","Data":"bf075523b0769cc58db4e1637f1845844c69fa2ab2dd6316cb6667f9da692f93"} Mar 07 07:13:57 crc kubenswrapper[4941]: I0307 07:13:57.701940 4941 scope.go:117] "RemoveContainer" containerID="d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9" Mar 07 07:13:57 crc kubenswrapper[4941]: E0307 07:13:57.702445 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9\": container with ID starting with d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9 not found: ID does not exist" containerID="d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9" Mar 07 07:13:57 crc kubenswrapper[4941]: I0307 07:13:57.702492 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9"} err="failed to get container status \"d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9\": rpc error: code = NotFound desc = could not find container \"d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9\": container with ID starting with d7b7aba73cabcd5439449d884b8172422532ad213ce8fff947418235d94207b9 not found: ID does not exist" Mar 07 07:13:57 crc kubenswrapper[4941]: I0307 07:13:57.702522 4941 scope.go:117] "RemoveContainer" containerID="8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2" Mar 07 07:13:57 crc kubenswrapper[4941]: E0307 07:13:57.703269 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2\": container with ID starting with 8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2 not found: ID does not exist" containerID="8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2" Mar 07 07:13:57 crc kubenswrapper[4941]: I0307 07:13:57.703296 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2"} err="failed to get container status \"8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2\": rpc error: code = NotFound desc = could not find container \"8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2\": container with ID 
starting with 8d80060a277e091c98206a1506d94b681a4f895838a304b5b3f7604c17b5a9c2 not found: ID does not exist" Mar 07 07:13:57 crc kubenswrapper[4941]: I0307 07:13:57.957538 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.026697 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-log-httpd\") pod \"e91dd3a3-9ab6-4183-8287-93b529afcd93\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.026761 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-sg-core-conf-yaml\") pod \"e91dd3a3-9ab6-4183-8287-93b529afcd93\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.026858 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-scripts\") pod \"e91dd3a3-9ab6-4183-8287-93b529afcd93\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.026934 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6jsd\" (UniqueName: \"kubernetes.io/projected/e91dd3a3-9ab6-4183-8287-93b529afcd93-kube-api-access-x6jsd\") pod \"e91dd3a3-9ab6-4183-8287-93b529afcd93\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.026966 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-config-data\") pod \"e91dd3a3-9ab6-4183-8287-93b529afcd93\" (UID: 
\"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.026983 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-combined-ca-bundle\") pod \"e91dd3a3-9ab6-4183-8287-93b529afcd93\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.027030 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-run-httpd\") pod \"e91dd3a3-9ab6-4183-8287-93b529afcd93\" (UID: \"e91dd3a3-9ab6-4183-8287-93b529afcd93\") " Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.028703 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e91dd3a3-9ab6-4183-8287-93b529afcd93" (UID: "e91dd3a3-9ab6-4183-8287-93b529afcd93"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.029622 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e91dd3a3-9ab6-4183-8287-93b529afcd93" (UID: "e91dd3a3-9ab6-4183-8287-93b529afcd93"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.035572 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-scripts" (OuterVolumeSpecName: "scripts") pod "e91dd3a3-9ab6-4183-8287-93b529afcd93" (UID: "e91dd3a3-9ab6-4183-8287-93b529afcd93"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.039190 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91dd3a3-9ab6-4183-8287-93b529afcd93-kube-api-access-x6jsd" (OuterVolumeSpecName: "kube-api-access-x6jsd") pod "e91dd3a3-9ab6-4183-8287-93b529afcd93" (UID: "e91dd3a3-9ab6-4183-8287-93b529afcd93"). InnerVolumeSpecName "kube-api-access-x6jsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.074992 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e91dd3a3-9ab6-4183-8287-93b529afcd93" (UID: "e91dd3a3-9ab6-4183-8287-93b529afcd93"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.129091 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6jsd\" (UniqueName: \"kubernetes.io/projected/e91dd3a3-9ab6-4183-8287-93b529afcd93-kube-api-access-x6jsd\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.129119 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.129128 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91dd3a3-9ab6-4183-8287-93b529afcd93-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.129136 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-sg-core-conf-yaml\") on 
node \"crc\" DevicePath \"\"" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.129144 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.129211 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e91dd3a3-9ab6-4183-8287-93b529afcd93" (UID: "e91dd3a3-9ab6-4183-8287-93b529afcd93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.139476 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-config-data" (OuterVolumeSpecName: "config-data") pod "e91dd3a3-9ab6-4183-8287-93b529afcd93" (UID: "e91dd3a3-9ab6-4183-8287-93b529afcd93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.230915 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.230960 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91dd3a3-9ab6-4183-8287-93b529afcd93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.685091 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598777fd7-9fgcl" event={"ID":"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510","Type":"ContainerStarted","Data":"cfcc8787654437566381b6642741214a6b8bd5d86a6e869ab2659ad125818269"} Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.685420 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598777fd7-9fgcl" event={"ID":"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510","Type":"ContainerStarted","Data":"2c0681e10d442ad2f1f0fc4b809613e3cdacd8d007ff20ae0809c479b09094cb"} Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.685463 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.685482 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.689154 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.689636 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91dd3a3-9ab6-4183-8287-93b529afcd93","Type":"ContainerDied","Data":"2d45ad1efacc5b7ee03ab5d48754510fd8ec19599a71274a20c858f917cde2f0"} Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.689802 4941 scope.go:117] "RemoveContainer" containerID="a6dabd7977184e5a3593081e782e51ca525e584cff685bf0e23886d6b0f8374b" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.692601 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b030b241-21f3-48a4-88de-c63abeddccb1","Type":"ContainerStarted","Data":"750b1358dd25ebdb02604816fc6ce9f0509325964bef5a9bdfbf090a38266760"} Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.706683 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5598777fd7-9fgcl" podStartSLOduration=8.706658762 podStartE2EDuration="8.706658762s" podCreationTimestamp="2026-03-07 07:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:13:58.702217689 +0000 UTC m=+1335.654583164" watchObservedRunningTime="2026-03-07 07:13:58.706658762 +0000 UTC m=+1335.659024227" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.716185 4941 scope.go:117] "RemoveContainer" containerID="5f1b9e84f63b5ec6fa685f6c77e6d290c2fd261231331c2de886282c82884767" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.739781 4941 scope.go:117] "RemoveContainer" containerID="60317e62b500b74652eac982fea934a674e730ed4d2b573d4ef18c1146ab1cbd" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.746364 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.767511096 podStartE2EDuration="13.746341049s" 
podCreationTimestamp="2026-03-07 07:13:45 +0000 UTC" firstStartedPulling="2026-03-07 07:13:46.70665065 +0000 UTC m=+1323.659016115" lastFinishedPulling="2026-03-07 07:13:57.685480603 +0000 UTC m=+1334.637846068" observedRunningTime="2026-03-07 07:13:58.738416238 +0000 UTC m=+1335.690781703" watchObservedRunningTime="2026-03-07 07:13:58.746341049 +0000 UTC m=+1335.698706514" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.767387 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.767887 4941 scope.go:117] "RemoveContainer" containerID="65a4666ce8e68c7938a912563839662d2d2bdd66f54dbd2cdd96614dc19f52d8" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.778255 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794004 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:13:58 crc kubenswrapper[4941]: E0307 07:13:58.794459 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="sg-core" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794486 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="sg-core" Mar 07 07:13:58 crc kubenswrapper[4941]: E0307 07:13:58.794508 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api-log" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794516 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api-log" Mar 07 07:13:58 crc kubenswrapper[4941]: E0307 07:13:58.794529 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="ceilometer-central-agent" Mar 07 
07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794537 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="ceilometer-central-agent" Mar 07 07:13:58 crc kubenswrapper[4941]: E0307 07:13:58.794552 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="ceilometer-notification-agent" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794560 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="ceilometer-notification-agent" Mar 07 07:13:58 crc kubenswrapper[4941]: E0307 07:13:58.794570 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="proxy-httpd" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794576 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="proxy-httpd" Mar 07 07:13:58 crc kubenswrapper[4941]: E0307 07:13:58.794600 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794606 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794794 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api-log" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794806 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="ceilometer-notification-agent" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794822 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" 
containerName="proxy-httpd" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794832 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="68568417-4fd1-4914-8efd-2acbbedc66f9" containerName="barbican-api" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794844 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="ceilometer-central-agent" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.794854 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" containerName="sg-core" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.796461 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.802728 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.806714 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.814516 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.944395 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.944475 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmbmn\" (UniqueName: \"kubernetes.io/projected/609e660e-8750-4956-93dc-c99f891c659f-kube-api-access-hmbmn\") pod \"ceilometer-0\" (UID: 
\"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.944520 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-log-httpd\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.944538 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-run-httpd\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.944824 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-scripts\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.944948 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:58 crc kubenswrapper[4941]: I0307 07:13:58.945003 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-config-data\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.047240 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-config-data\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.047332 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.047387 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbmn\" (UniqueName: \"kubernetes.io/projected/609e660e-8750-4956-93dc-c99f891c659f-kube-api-access-hmbmn\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.047519 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-log-httpd\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.047544 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-run-httpd\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.047618 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-scripts\") pod \"ceilometer-0\" (UID: 
\"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.047678 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.048013 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-log-httpd\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.048126 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-run-httpd\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.052554 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.054012 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-scripts\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.055011 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.067566 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-config-data\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.073018 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbmn\" (UniqueName: \"kubernetes.io/projected/609e660e-8750-4956-93dc-c99f891c659f-kube-api-access-hmbmn\") pod \"ceilometer-0\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.099844 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.100092 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerName="glance-log" containerID="cri-o://548daa53b60a430ed9122c44f5914f8f99a73a64e60a05630135d619b42368a0" gracePeriod=30 Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.100189 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerName="glance-httpd" containerID="cri-o://ba7162468c4c70ab71e9a28be05ef6c3bf697ae5de5b59603a39b7e5b0531b86" gracePeriod=30 Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.120278 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.702526 4941 generic.go:334] "Generic (PLEG): container finished" podID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerID="548daa53b60a430ed9122c44f5914f8f99a73a64e60a05630135d619b42368a0" exitCode=143 Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.702655 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df120d9b-6b3c-401e-9847-5799a00ccba4","Type":"ContainerDied","Data":"548daa53b60a430ed9122c44f5914f8f99a73a64e60a05630135d619b42368a0"} Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.755901 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:13:59 crc kubenswrapper[4941]: W0307 07:13:59.758683 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod609e660e_8750_4956_93dc_c99f891c659f.slice/crio-15d93da7ed069d414f8a4e60882644240d4ac90d75a9b71edb2ae343fa35293a WatchSource:0}: Error finding container 15d93da7ed069d414f8a4e60882644240d4ac90d75a9b71edb2ae343fa35293a: Status 404 returned error can't find the container with id 15d93da7ed069d414f8a4e60882644240d4ac90d75a9b71edb2ae343fa35293a Mar 07 07:13:59 crc kubenswrapper[4941]: I0307 07:13:59.966050 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e91dd3a3-9ab6-4183-8287-93b529afcd93" path="/var/lib/kubelet/pods/e91dd3a3-9ab6-4183-8287-93b529afcd93/volumes" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.138232 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547794-76jnh"] Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.139755 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547794-76jnh" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.148840 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.148900 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.148897 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.154933 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547794-76jnh"] Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.268876 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v99fk\" (UniqueName: \"kubernetes.io/projected/eb5331d0-b6ce-4295-8b61-ffb8f5425d7a-kube-api-access-v99fk\") pod \"auto-csr-approver-29547794-76jnh\" (UID: \"eb5331d0-b6ce-4295-8b61-ffb8f5425d7a\") " pod="openshift-infra/auto-csr-approver-29547794-76jnh" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.370665 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v99fk\" (UniqueName: \"kubernetes.io/projected/eb5331d0-b6ce-4295-8b61-ffb8f5425d7a-kube-api-access-v99fk\") pod \"auto-csr-approver-29547794-76jnh\" (UID: \"eb5331d0-b6ce-4295-8b61-ffb8f5425d7a\") " pod="openshift-infra/auto-csr-approver-29547794-76jnh" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.401002 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v99fk\" (UniqueName: \"kubernetes.io/projected/eb5331d0-b6ce-4295-8b61-ffb8f5425d7a-kube-api-access-v99fk\") pod \"auto-csr-approver-29547794-76jnh\" (UID: \"eb5331d0-b6ce-4295-8b61-ffb8f5425d7a\") " 
pod="openshift-infra/auto-csr-approver-29547794-76jnh" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.468105 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547794-76jnh" Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.717949 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerStarted","Data":"78e90976beb39d9ea7b54f5e247200f49f0375f48e5b8dc8df8dc30b49c4d321"} Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.718243 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerStarted","Data":"15d93da7ed069d414f8a4e60882644240d4ac90d75a9b71edb2ae343fa35293a"} Mar 07 07:14:00 crc kubenswrapper[4941]: I0307 07:14:00.897656 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547794-76jnh"] Mar 07 07:14:00 crc kubenswrapper[4941]: W0307 07:14:00.902704 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb5331d0_b6ce_4295_8b61_ffb8f5425d7a.slice/crio-a77737db984b7cfd71e59c71fe12f34209ae06fb94754ca9fc96e68194611610 WatchSource:0}: Error finding container a77737db984b7cfd71e59c71fe12f34209ae06fb94754ca9fc96e68194611610: Status 404 returned error can't find the container with id a77737db984b7cfd71e59c71fe12f34209ae06fb94754ca9fc96e68194611610 Mar 07 07:14:01 crc kubenswrapper[4941]: I0307 07:14:01.544144 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:14:01 crc kubenswrapper[4941]: I0307 07:14:01.544975 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerName="glance-log" 
containerID="cri-o://00b8ba1aafedcbae78e907fe0a3b46e0598930393958c65057796c47d5f40bd3" gracePeriod=30 Mar 07 07:14:01 crc kubenswrapper[4941]: I0307 07:14:01.545058 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerName="glance-httpd" containerID="cri-o://34f9181ce1397e6679e1fcdcec34b8497b19db5025642f91b4b8ab393ba9d654" gracePeriod=30 Mar 07 07:14:01 crc kubenswrapper[4941]: I0307 07:14:01.752201 4941 generic.go:334] "Generic (PLEG): container finished" podID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerID="00b8ba1aafedcbae78e907fe0a3b46e0598930393958c65057796c47d5f40bd3" exitCode=143 Mar 07 07:14:01 crc kubenswrapper[4941]: I0307 07:14:01.752317 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e00c9299-d657-4baa-8381-feb1a099f6f3","Type":"ContainerDied","Data":"00b8ba1aafedcbae78e907fe0a3b46e0598930393958c65057796c47d5f40bd3"} Mar 07 07:14:01 crc kubenswrapper[4941]: I0307 07:14:01.759547 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerStarted","Data":"6c3739b7f2a5e11c0e104ee916b474288fcc816263b755ea9a372604afda32c6"} Mar 07 07:14:01 crc kubenswrapper[4941]: I0307 07:14:01.761137 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547794-76jnh" event={"ID":"eb5331d0-b6ce-4295-8b61-ffb8f5425d7a","Type":"ContainerStarted","Data":"a77737db984b7cfd71e59c71fe12f34209ae06fb94754ca9fc96e68194611610"} Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.675817 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.784133 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerStarted","Data":"ae7b9ebd5e5aabacc30e4c60737428ec243f2b66eb09db8c78c52e33bf4b18d3"} Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.793575 4941 generic.go:334] "Generic (PLEG): container finished" podID="eb5331d0-b6ce-4295-8b61-ffb8f5425d7a" containerID="e479eac33d253a468fe385bde1c5fe732f9ef0a1dd4b506f889b495235a2a7db" exitCode=0 Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.793702 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547794-76jnh" event={"ID":"eb5331d0-b6ce-4295-8b61-ffb8f5425d7a","Type":"ContainerDied","Data":"e479eac33d253a468fe385bde1c5fe732f9ef0a1dd4b506f889b495235a2a7db"} Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.797631 4941 generic.go:334] "Generic (PLEG): container finished" podID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerID="ba7162468c4c70ab71e9a28be05ef6c3bf697ae5de5b59603a39b7e5b0531b86" exitCode=0 Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.797673 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df120d9b-6b3c-401e-9847-5799a00ccba4","Type":"ContainerDied","Data":"ba7162468c4c70ab71e9a28be05ef6c3bf697ae5de5b59603a39b7e5b0531b86"} Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.797695 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df120d9b-6b3c-401e-9847-5799a00ccba4","Type":"ContainerDied","Data":"af029582dc89509ce621f7f9fa4c7eba8871dd48fb81a128ce473d1dd677e82a"} Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.797706 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af029582dc89509ce621f7f9fa4c7eba8871dd48fb81a128ce473d1dd677e82a" Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.820744 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.918627 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-httpd-run\") pod \"df120d9b-6b3c-401e-9847-5799a00ccba4\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.918724 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-scripts\") pod \"df120d9b-6b3c-401e-9847-5799a00ccba4\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.918752 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-public-tls-certs\") pod \"df120d9b-6b3c-401e-9847-5799a00ccba4\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.918799 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"df120d9b-6b3c-401e-9847-5799a00ccba4\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.918830 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kp68\" (UniqueName: \"kubernetes.io/projected/df120d9b-6b3c-401e-9847-5799a00ccba4-kube-api-access-6kp68\") pod \"df120d9b-6b3c-401e-9847-5799a00ccba4\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.918866 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-config-data\") pod \"df120d9b-6b3c-401e-9847-5799a00ccba4\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.918941 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-logs\") pod \"df120d9b-6b3c-401e-9847-5799a00ccba4\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.919118 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-combined-ca-bundle\") pod \"df120d9b-6b3c-401e-9847-5799a00ccba4\" (UID: \"df120d9b-6b3c-401e-9847-5799a00ccba4\") " Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.919557 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-logs" (OuterVolumeSpecName: "logs") pod "df120d9b-6b3c-401e-9847-5799a00ccba4" (UID: "df120d9b-6b3c-401e-9847-5799a00ccba4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.919255 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df120d9b-6b3c-401e-9847-5799a00ccba4" (UID: "df120d9b-6b3c-401e-9847-5799a00ccba4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.924615 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-scripts" (OuterVolumeSpecName: "scripts") pod "df120d9b-6b3c-401e-9847-5799a00ccba4" (UID: "df120d9b-6b3c-401e-9847-5799a00ccba4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.926645 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df120d9b-6b3c-401e-9847-5799a00ccba4-kube-api-access-6kp68" (OuterVolumeSpecName: "kube-api-access-6kp68") pod "df120d9b-6b3c-401e-9847-5799a00ccba4" (UID: "df120d9b-6b3c-401e-9847-5799a00ccba4"). InnerVolumeSpecName "kube-api-access-6kp68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.931490 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "df120d9b-6b3c-401e-9847-5799a00ccba4" (UID: "df120d9b-6b3c-401e-9847-5799a00ccba4"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.952311 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df120d9b-6b3c-401e-9847-5799a00ccba4" (UID: "df120d9b-6b3c-401e-9847-5799a00ccba4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:02 crc kubenswrapper[4941]: I0307 07:14:02.987996 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-config-data" (OuterVolumeSpecName: "config-data") pod "df120d9b-6b3c-401e-9847-5799a00ccba4" (UID: "df120d9b-6b3c-401e-9847-5799a00ccba4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.013956 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "df120d9b-6b3c-401e-9847-5799a00ccba4" (UID: "df120d9b-6b3c-401e-9847-5799a00ccba4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.021866 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.021942 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.021957 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.021969 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:03 crc 
kubenswrapper[4941]: I0307 07:14:03.022018 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.022029 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kp68\" (UniqueName: \"kubernetes.io/projected/df120d9b-6b3c-401e-9847-5799a00ccba4-kube-api-access-6kp68\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.022039 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df120d9b-6b3c-401e-9847-5799a00ccba4-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.022048 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df120d9b-6b3c-401e-9847-5799a00ccba4-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.046288 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.123804 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.808121 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerStarted","Data":"ca7bfc2e1bf2b29c667ade5852c57a363a583198606dffe5a8480cc5c981945a"} Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.808163 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.808522 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="ceilometer-notification-agent" containerID="cri-o://6c3739b7f2a5e11c0e104ee916b474288fcc816263b755ea9a372604afda32c6" gracePeriod=30 Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.808597 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="sg-core" containerID="cri-o://ae7b9ebd5e5aabacc30e4c60737428ec243f2b66eb09db8c78c52e33bf4b18d3" gracePeriod=30 Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.808306 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="ceilometer-central-agent" containerID="cri-o://78e90976beb39d9ea7b54f5e247200f49f0375f48e5b8dc8df8dc30b49c4d321" gracePeriod=30 Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.808314 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="proxy-httpd" containerID="cri-o://ca7bfc2e1bf2b29c667ade5852c57a363a583198606dffe5a8480cc5c981945a" gracePeriod=30 Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.835203 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.291487115 podStartE2EDuration="5.835185637s" podCreationTimestamp="2026-03-07 07:13:58 +0000 UTC" firstStartedPulling="2026-03-07 07:13:59.761472953 +0000 UTC m=+1336.713838418" lastFinishedPulling="2026-03-07 07:14:03.305171475 +0000 UTC m=+1340.257536940" observedRunningTime="2026-03-07 07:14:03.833573296 +0000 UTC m=+1340.785938761" 
watchObservedRunningTime="2026-03-07 07:14:03.835185637 +0000 UTC m=+1340.787551102" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.876686 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.903347 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.980037 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" path="/var/lib/kubelet/pods/df120d9b-6b3c-401e-9847-5799a00ccba4/volumes" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.980987 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:14:03 crc kubenswrapper[4941]: E0307 07:14:03.981310 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerName="glance-httpd" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.981329 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerName="glance-httpd" Mar 07 07:14:03 crc kubenswrapper[4941]: E0307 07:14:03.981357 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerName="glance-log" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.981366 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerName="glance-log" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.981591 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" containerName="glance-log" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.981612 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="df120d9b-6b3c-401e-9847-5799a00ccba4" 
containerName="glance-httpd" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.982369 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.982728 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.986775 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 07 07:14:03 crc kubenswrapper[4941]: I0307 07:14:03.987120 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.038781 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vft9m\" (UniqueName: \"kubernetes.io/projected/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-kube-api-access-vft9m\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.038820 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.038854 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc 
kubenswrapper[4941]: I0307 07:14:04.038928 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-logs\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.038984 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.039002 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.039204 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.039261 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc 
kubenswrapper[4941]: E0307 07:14:04.111703 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf120d9b_6b3c_401e_9847_5799a00ccba4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf120d9b_6b3c_401e_9847_5799a00ccba4.slice/crio-af029582dc89509ce621f7f9fa4c7eba8871dd48fb81a128ce473d1dd677e82a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod609e660e_8750_4956_93dc_c99f891c659f.slice/crio-conmon-ae7b9ebd5e5aabacc30e4c60737428ec243f2b66eb09db8c78c52e33bf4b18d3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c05d887_e05c_4593_a5ad_76be76a9e637.slice/crio-ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod609e660e_8750_4956_93dc_c99f891c659f.slice/crio-ae7b9ebd5e5aabacc30e4c60737428ec243f2b66eb09db8c78c52e33bf4b18d3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod609e660e_8750_4956_93dc_c99f891c659f.slice/crio-conmon-ca7bfc2e1bf2b29c667ade5852c57a363a583198606dffe5a8480cc5c981945a.scope\": RecentStats: unable to find data in memory cache]" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.140634 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.140693 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.140752 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vft9m\" (UniqueName: \"kubernetes.io/projected/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-kube-api-access-vft9m\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.140772 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.140799 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.140855 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-logs\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.140906 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.140925 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.142450 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-logs\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.142739 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.143656 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.149207 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.150986 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.151095 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.164193 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vft9m\" (UniqueName: \"kubernetes.io/projected/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-kube-api-access-vft9m\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.184728 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.193384 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.217839 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547794-76jnh" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.242296 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v99fk\" (UniqueName: \"kubernetes.io/projected/eb5331d0-b6ce-4295-8b61-ffb8f5425d7a-kube-api-access-v99fk\") pod \"eb5331d0-b6ce-4295-8b61-ffb8f5425d7a\" (UID: \"eb5331d0-b6ce-4295-8b61-ffb8f5425d7a\") " Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.251805 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb5331d0-b6ce-4295-8b61-ffb8f5425d7a-kube-api-access-v99fk" (OuterVolumeSpecName: "kube-api-access-v99fk") pod "eb5331d0-b6ce-4295-8b61-ffb8f5425d7a" (UID: "eb5331d0-b6ce-4295-8b61-ffb8f5425d7a"). InnerVolumeSpecName "kube-api-access-v99fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.320307 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.344689 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v99fk\" (UniqueName: \"kubernetes.io/projected/eb5331d0-b6ce-4295-8b61-ffb8f5425d7a-kube-api-access-v99fk\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.822450 4941 generic.go:334] "Generic (PLEG): container finished" podID="609e660e-8750-4956-93dc-c99f891c659f" containerID="ca7bfc2e1bf2b29c667ade5852c57a363a583198606dffe5a8480cc5c981945a" exitCode=0 Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.822699 4941 generic.go:334] "Generic (PLEG): container finished" podID="609e660e-8750-4956-93dc-c99f891c659f" containerID="ae7b9ebd5e5aabacc30e4c60737428ec243f2b66eb09db8c78c52e33bf4b18d3" exitCode=2 Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.822708 4941 generic.go:334] "Generic (PLEG): container finished" podID="609e660e-8750-4956-93dc-c99f891c659f" containerID="6c3739b7f2a5e11c0e104ee916b474288fcc816263b755ea9a372604afda32c6" exitCode=0 Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.822515 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerDied","Data":"ca7bfc2e1bf2b29c667ade5852c57a363a583198606dffe5a8480cc5c981945a"} Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.822807 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerDied","Data":"ae7b9ebd5e5aabacc30e4c60737428ec243f2b66eb09db8c78c52e33bf4b18d3"} Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.822823 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerDied","Data":"6c3739b7f2a5e11c0e104ee916b474288fcc816263b755ea9a372604afda32c6"} Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.829248 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547794-76jnh" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.829236 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547794-76jnh" event={"ID":"eb5331d0-b6ce-4295-8b61-ffb8f5425d7a","Type":"ContainerDied","Data":"a77737db984b7cfd71e59c71fe12f34209ae06fb94754ca9fc96e68194611610"} Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.829413 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a77737db984b7cfd71e59c71fe12f34209ae06fb94754ca9fc96e68194611610" Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.838236 4941 generic.go:334] "Generic (PLEG): container finished" podID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerID="34f9181ce1397e6679e1fcdcec34b8497b19db5025642f91b4b8ab393ba9d654" exitCode=0 Mar 07 07:14:04 crc kubenswrapper[4941]: I0307 07:14:04.838275 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e00c9299-d657-4baa-8381-feb1a099f6f3","Type":"ContainerDied","Data":"34f9181ce1397e6679e1fcdcec34b8497b19db5025642f91b4b8ab393ba9d654"} Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.221054 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.267490 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-logs\") pod \"e00c9299-d657-4baa-8381-feb1a099f6f3\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.267532 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-config-data\") pod \"e00c9299-d657-4baa-8381-feb1a099f6f3\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.267652 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-combined-ca-bundle\") pod \"e00c9299-d657-4baa-8381-feb1a099f6f3\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.267683 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-httpd-run\") pod \"e00c9299-d657-4baa-8381-feb1a099f6f3\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.267748 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-internal-tls-certs\") pod \"e00c9299-d657-4baa-8381-feb1a099f6f3\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.267817 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"e00c9299-d657-4baa-8381-feb1a099f6f3\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.267840 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6zgt\" (UniqueName: \"kubernetes.io/projected/e00c9299-d657-4baa-8381-feb1a099f6f3-kube-api-access-r6zgt\") pod \"e00c9299-d657-4baa-8381-feb1a099f6f3\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.267856 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-scripts\") pod \"e00c9299-d657-4baa-8381-feb1a099f6f3\" (UID: \"e00c9299-d657-4baa-8381-feb1a099f6f3\") " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.270097 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e00c9299-d657-4baa-8381-feb1a099f6f3" (UID: "e00c9299-d657-4baa-8381-feb1a099f6f3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.270478 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-logs" (OuterVolumeSpecName: "logs") pod "e00c9299-d657-4baa-8381-feb1a099f6f3" (UID: "e00c9299-d657-4baa-8381-feb1a099f6f3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.279630 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00c9299-d657-4baa-8381-feb1a099f6f3-kube-api-access-r6zgt" (OuterVolumeSpecName: "kube-api-access-r6zgt") pod "e00c9299-d657-4baa-8381-feb1a099f6f3" (UID: "e00c9299-d657-4baa-8381-feb1a099f6f3"). InnerVolumeSpecName "kube-api-access-r6zgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.288485 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "e00c9299-d657-4baa-8381-feb1a099f6f3" (UID: "e00c9299-d657-4baa-8381-feb1a099f6f3"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.288501 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-scripts" (OuterVolumeSpecName: "scripts") pod "e00c9299-d657-4baa-8381-feb1a099f6f3" (UID: "e00c9299-d657-4baa-8381-feb1a099f6f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.311782 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547788-8s6c7"] Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.318287 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e00c9299-d657-4baa-8381-feb1a099f6f3" (UID: "e00c9299-d657-4baa-8381-feb1a099f6f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.319127 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547788-8s6c7"] Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.342812 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-config-data" (OuterVolumeSpecName: "config-data") pod "e00c9299-d657-4baa-8381-feb1a099f6f3" (UID: "e00c9299-d657-4baa-8381-feb1a099f6f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.356523 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e00c9299-d657-4baa-8381-feb1a099f6f3" (UID: "e00c9299-d657-4baa-8381-feb1a099f6f3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.369242 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.369270 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.369280 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.369308 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.369318 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6zgt\" (UniqueName: \"kubernetes.io/projected/e00c9299-d657-4baa-8381-feb1a099f6f3-kube-api-access-r6zgt\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.369328 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.369336 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e00c9299-d657-4baa-8381-feb1a099f6f3-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.369344 4941 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00c9299-d657-4baa-8381-feb1a099f6f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.389290 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.471169 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.636887 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:14:05 crc kubenswrapper[4941]: W0307 07:14:05.652612 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod927b9eb0_124f_4a2c_86ae_2ea4cbe609e7.slice/crio-7090c3e128fde91da862c0961652bb0c7d813f6b4403bdd78912771666681ca1 WatchSource:0}: Error finding container 7090c3e128fde91da862c0961652bb0c7d813f6b4403bdd78912771666681ca1: Status 404 returned error can't find the container with id 7090c3e128fde91da862c0961652bb0c7d813f6b4403bdd78912771666681ca1 Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.791259 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.795032 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5598777fd7-9fgcl" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.867705 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7","Type":"ContainerStarted","Data":"7090c3e128fde91da862c0961652bb0c7d813f6b4403bdd78912771666681ca1"} Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.871529 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e00c9299-d657-4baa-8381-feb1a099f6f3","Type":"ContainerDied","Data":"60d5da6a07923d1f3a807878bf7c9e2417646acdb587caca76b524914114106e"} Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.871642 4941 scope.go:117] "RemoveContainer" containerID="34f9181ce1397e6679e1fcdcec34b8497b19db5025642f91b4b8ab393ba9d654" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.871754 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.919770 4941 scope.go:117] "RemoveContainer" containerID="00b8ba1aafedcbae78e907fe0a3b46e0598930393958c65057796c47d5f40bd3" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.939591 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.947605 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.967885 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f408644-d7ae-43ad-a056-fbe07aca78c1" path="/var/lib/kubelet/pods/6f408644-d7ae-43ad-a056-fbe07aca78c1/volumes" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.968737 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" path="/var/lib/kubelet/pods/e00c9299-d657-4baa-8381-feb1a099f6f3/volumes" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.969300 4941 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 07 07:14:05 crc kubenswrapper[4941]: E0307 07:14:05.969581 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerName="glance-httpd" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.969596 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerName="glance-httpd" Mar 07 07:14:05 crc kubenswrapper[4941]: E0307 07:14:05.969616 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5331d0-b6ce-4295-8b61-ffb8f5425d7a" containerName="oc" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.969622 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5331d0-b6ce-4295-8b61-ffb8f5425d7a" containerName="oc" Mar 07 07:14:05 crc kubenswrapper[4941]: E0307 07:14:05.969638 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerName="glance-log" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.969644 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerName="glance-log" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.969801 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerName="glance-log" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.969811 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb5331d0-b6ce-4295-8b61-ffb8f5425d7a" containerName="oc" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.969827 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00c9299-d657-4baa-8381-feb1a099f6f3" containerName="glance-httpd" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.970584 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 
07:14:05.970692 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.973849 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 07 07:14:05 crc kubenswrapper[4941]: I0307 07:14:05.974056 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.087937 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.088011 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.088028 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncks6\" (UniqueName: \"kubernetes.io/projected/e6d72c12-422e-48fd-b56b-8344260e3e01-kube-api-access-ncks6\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.088081 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.088159 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.088232 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.088356 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.088580 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190150 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190213 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190266 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190299 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190344 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190367 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190384 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncks6\" (UniqueName: \"kubernetes.io/projected/e6d72c12-422e-48fd-b56b-8344260e3e01-kube-api-access-ncks6\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190429 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190848 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.190929 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.191468 4941 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") device mount path 
\"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.199635 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.201311 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.201342 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.202292 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.220122 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncks6\" (UniqueName: \"kubernetes.io/projected/e6d72c12-422e-48fd-b56b-8344260e3e01-kube-api-access-ncks6\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 
crc kubenswrapper[4941]: I0307 07:14:06.237014 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.332781 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.863638 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:14:06 crc kubenswrapper[4941]: W0307 07:14:06.878779 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d72c12_422e_48fd_b56b_8344260e3e01.slice/crio-98678e2d19eceae08efb60bdac32c0da6191c50324dd4cc14b7dbf4a21896eda WatchSource:0}: Error finding container 98678e2d19eceae08efb60bdac32c0da6191c50324dd4cc14b7dbf4a21896eda: Status 404 returned error can't find the container with id 98678e2d19eceae08efb60bdac32c0da6191c50324dd4cc14b7dbf4a21896eda Mar 07 07:14:06 crc kubenswrapper[4941]: I0307 07:14:06.880441 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7","Type":"ContainerStarted","Data":"b048a7900d858225d4830b842a8b3ed5f22a79a6146d7e6677222d62c98ef913"} Mar 07 07:14:07 crc kubenswrapper[4941]: I0307 07:14:07.893784 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7","Type":"ContainerStarted","Data":"885046a8a3875aba6340cf20fd6d156d9145acc04deb00a3dc3da2bb74d33aa4"} Mar 07 07:14:07 crc kubenswrapper[4941]: I0307 07:14:07.898312 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"e6d72c12-422e-48fd-b56b-8344260e3e01","Type":"ContainerStarted","Data":"cee97226fd2fe2196abc3b2a74e837c3f4b06a16e8e4628edcae67b979c11f70"} Mar 07 07:14:07 crc kubenswrapper[4941]: I0307 07:14:07.898353 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6d72c12-422e-48fd-b56b-8344260e3e01","Type":"ContainerStarted","Data":"98678e2d19eceae08efb60bdac32c0da6191c50324dd4cc14b7dbf4a21896eda"} Mar 07 07:14:08 crc kubenswrapper[4941]: I0307 07:14:08.910129 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6d72c12-422e-48fd-b56b-8344260e3e01","Type":"ContainerStarted","Data":"ed43789861becd87eee81a4232f20de6afb6f8198fc9dd762f6924dee8e81bc0"} Mar 07 07:14:08 crc kubenswrapper[4941]: I0307 07:14:08.936666 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.936648628 podStartE2EDuration="3.936648628s" podCreationTimestamp="2026-03-07 07:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:14:08.931142608 +0000 UTC m=+1345.883508083" watchObservedRunningTime="2026-03-07 07:14:08.936648628 +0000 UTC m=+1345.889014093" Mar 07 07:14:08 crc kubenswrapper[4941]: I0307 07:14:08.938349 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.938343421 podStartE2EDuration="5.938343421s" podCreationTimestamp="2026-03-07 07:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:14:07.923614186 +0000 UTC m=+1344.875979661" watchObservedRunningTime="2026-03-07 07:14:08.938343421 +0000 UTC m=+1345.890708886" Mar 07 07:14:10 crc 
kubenswrapper[4941]: I0307 07:14:10.313981 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:14:10 crc kubenswrapper[4941]: I0307 07:14:10.314338 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:14:11 crc kubenswrapper[4941]: I0307 07:14:11.938359 4941 generic.go:334] "Generic (PLEG): container finished" podID="609e660e-8750-4956-93dc-c99f891c659f" containerID="78e90976beb39d9ea7b54f5e247200f49f0375f48e5b8dc8df8dc30b49c4d321" exitCode=0 Mar 07 07:14:11 crc kubenswrapper[4941]: I0307 07:14:11.938419 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerDied","Data":"78e90976beb39d9ea7b54f5e247200f49f0375f48e5b8dc8df8dc30b49c4d321"} Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.253161 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rnqcz"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.258888 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.264686 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rnqcz"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.337476 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.352212 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mc6xs"] Mar 07 07:14:12 crc kubenswrapper[4941]: E0307 07:14:12.352760 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="proxy-httpd" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.352789 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="proxy-httpd" Mar 07 07:14:12 crc kubenswrapper[4941]: E0307 07:14:12.352815 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="sg-core" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.352823 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="sg-core" Mar 07 07:14:12 crc kubenswrapper[4941]: E0307 07:14:12.352858 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="ceilometer-notification-agent" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.352867 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="ceilometer-notification-agent" Mar 07 07:14:12 crc kubenswrapper[4941]: E0307 07:14:12.352883 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="ceilometer-central-agent" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.352891 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="ceilometer-central-agent" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.353101 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="609e660e-8750-4956-93dc-c99f891c659f" 
containerName="ceilometer-central-agent" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.353127 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="ceilometer-notification-agent" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.353136 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="proxy-httpd" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.353159 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="609e660e-8750-4956-93dc-c99f891c659f" containerName="sg-core" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.353833 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.359952 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mc6xs"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.375315 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-427c-account-create-update-fzfss"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.376361 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.385763 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.398829 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-427c-account-create-update-fzfss"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.402443 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-operator-scripts\") pod \"nova-api-db-create-rnqcz\" (UID: \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\") " pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.402625 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whrl\" (UniqueName: \"kubernetes.io/projected/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-kube-api-access-8whrl\") pod \"nova-api-db-create-rnqcz\" (UID: \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\") " pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.457168 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-fkcb9"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.458241 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.464981 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fkcb9"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504197 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-log-httpd\") pod \"609e660e-8750-4956-93dc-c99f891c659f\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504268 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmbmn\" (UniqueName: \"kubernetes.io/projected/609e660e-8750-4956-93dc-c99f891c659f-kube-api-access-hmbmn\") pod \"609e660e-8750-4956-93dc-c99f891c659f\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504340 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-config-data\") pod \"609e660e-8750-4956-93dc-c99f891c659f\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504388 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-scripts\") pod \"609e660e-8750-4956-93dc-c99f891c659f\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504428 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-sg-core-conf-yaml\") pod \"609e660e-8750-4956-93dc-c99f891c659f\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " Mar 07 07:14:12 
crc kubenswrapper[4941]: I0307 07:14:12.504455 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-combined-ca-bundle\") pod \"609e660e-8750-4956-93dc-c99f891c659f\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504502 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-run-httpd\") pod \"609e660e-8750-4956-93dc-c99f891c659f\" (UID: \"609e660e-8750-4956-93dc-c99f891c659f\") " Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504663 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "609e660e-8750-4956-93dc-c99f891c659f" (UID: "609e660e-8750-4956-93dc-c99f891c659f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504677 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whrl\" (UniqueName: \"kubernetes.io/projected/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-kube-api-access-8whrl\") pod \"nova-api-db-create-rnqcz\" (UID: \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\") " pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.504937 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-operator-scripts\") pod \"nova-api-db-create-rnqcz\" (UID: \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\") " pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.505047 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/930d2540-f85c-425f-8750-75ceb6d183b7-operator-scripts\") pod \"nova-api-427c-account-create-update-fzfss\" (UID: \"930d2540-f85c-425f-8750-75ceb6d183b7\") " pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.505088 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj7zf\" (UniqueName: \"kubernetes.io/projected/53f6e27d-9523-4815-9efe-bf92df44ae37-kube-api-access-sj7zf\") pod \"nova-cell0-db-create-mc6xs\" (UID: \"53f6e27d-9523-4815-9efe-bf92df44ae37\") " pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.505169 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f6e27d-9523-4815-9efe-bf92df44ae37-operator-scripts\") pod 
\"nova-cell0-db-create-mc6xs\" (UID: \"53f6e27d-9523-4815-9efe-bf92df44ae37\") " pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.505229 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt8bp\" (UniqueName: \"kubernetes.io/projected/930d2540-f85c-425f-8750-75ceb6d183b7-kube-api-access-jt8bp\") pod \"nova-api-427c-account-create-update-fzfss\" (UID: \"930d2540-f85c-425f-8750-75ceb6d183b7\") " pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.505333 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.505943 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-operator-scripts\") pod \"nova-api-db-create-rnqcz\" (UID: \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\") " pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.506694 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "609e660e-8750-4956-93dc-c99f891c659f" (UID: "609e660e-8750-4956-93dc-c99f891c659f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.512232 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609e660e-8750-4956-93dc-c99f891c659f-kube-api-access-hmbmn" (OuterVolumeSpecName: "kube-api-access-hmbmn") pod "609e660e-8750-4956-93dc-c99f891c659f" (UID: "609e660e-8750-4956-93dc-c99f891c659f"). 
InnerVolumeSpecName "kube-api-access-hmbmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.527508 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whrl\" (UniqueName: \"kubernetes.io/projected/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-kube-api-access-8whrl\") pod \"nova-api-db-create-rnqcz\" (UID: \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\") " pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.538199 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-scripts" (OuterVolumeSpecName: "scripts") pod "609e660e-8750-4956-93dc-c99f891c659f" (UID: "609e660e-8750-4956-93dc-c99f891c659f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.555193 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mr8p7"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.556347 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.563352 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.572534 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "609e660e-8750-4956-93dc-c99f891c659f" (UID: "609e660e-8750-4956-93dc-c99f891c659f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.594036 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mr8p7"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607235 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/930d2540-f85c-425f-8750-75ceb6d183b7-operator-scripts\") pod \"nova-api-427c-account-create-update-fzfss\" (UID: \"930d2540-f85c-425f-8750-75ceb6d183b7\") " pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607291 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj7zf\" (UniqueName: \"kubernetes.io/projected/53f6e27d-9523-4815-9efe-bf92df44ae37-kube-api-access-sj7zf\") pod \"nova-cell0-db-create-mc6xs\" (UID: \"53f6e27d-9523-4815-9efe-bf92df44ae37\") " pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607352 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f6e27d-9523-4815-9efe-bf92df44ae37-operator-scripts\") pod \"nova-cell0-db-create-mc6xs\" (UID: \"53f6e27d-9523-4815-9efe-bf92df44ae37\") " pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607396 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt8bp\" (UniqueName: \"kubernetes.io/projected/930d2540-f85c-425f-8750-75ceb6d183b7-kube-api-access-jt8bp\") pod \"nova-api-427c-account-create-update-fzfss\" (UID: \"930d2540-f85c-425f-8750-75ceb6d183b7\") " pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607543 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-operator-scripts\") pod \"nova-cell1-db-create-fkcb9\" (UID: \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\") " pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607579 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhpph\" (UniqueName: \"kubernetes.io/projected/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-kube-api-access-qhpph\") pod \"nova-cell1-db-create-fkcb9\" (UID: \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\") " pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607644 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607660 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/609e660e-8750-4956-93dc-c99f891c659f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607672 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmbmn\" (UniqueName: \"kubernetes.io/projected/609e660e-8750-4956-93dc-c99f891c659f-kube-api-access-hmbmn\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.607685 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.608472 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/930d2540-f85c-425f-8750-75ceb6d183b7-operator-scripts\") pod \"nova-api-427c-account-create-update-fzfss\" (UID: \"930d2540-f85c-425f-8750-75ceb6d183b7\") " pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.609632 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f6e27d-9523-4815-9efe-bf92df44ae37-operator-scripts\") pod \"nova-cell0-db-create-mc6xs\" (UID: \"53f6e27d-9523-4815-9efe-bf92df44ae37\") " pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.630469 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt8bp\" (UniqueName: \"kubernetes.io/projected/930d2540-f85c-425f-8750-75ceb6d183b7-kube-api-access-jt8bp\") pod \"nova-api-427c-account-create-update-fzfss\" (UID: \"930d2540-f85c-425f-8750-75ceb6d183b7\") " pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.630463 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj7zf\" (UniqueName: \"kubernetes.io/projected/53f6e27d-9523-4815-9efe-bf92df44ae37-kube-api-access-sj7zf\") pod \"nova-cell0-db-create-mc6xs\" (UID: \"53f6e27d-9523-4815-9efe-bf92df44ae37\") " pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.645010 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-config-data" (OuterVolumeSpecName: "config-data") pod "609e660e-8750-4956-93dc-c99f891c659f" (UID: "609e660e-8750-4956-93dc-c99f891c659f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.647379 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.649726 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "609e660e-8750-4956-93dc-c99f891c659f" (UID: "609e660e-8750-4956-93dc-c99f891c659f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.666880 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.697726 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.711434 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f249265-748f-4a9d-a23c-7bd70a62b669-operator-scripts\") pod \"nova-cell0-3b99-account-create-update-mr8p7\" (UID: \"8f249265-748f-4a9d-a23c-7bd70a62b669\") " pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.711490 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5f2t\" (UniqueName: \"kubernetes.io/projected/8f249265-748f-4a9d-a23c-7bd70a62b669-kube-api-access-z5f2t\") pod \"nova-cell0-3b99-account-create-update-mr8p7\" (UID: \"8f249265-748f-4a9d-a23c-7bd70a62b669\") " pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.711519 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-operator-scripts\") pod \"nova-cell1-db-create-fkcb9\" (UID: \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\") " pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.711542 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhpph\" (UniqueName: \"kubernetes.io/projected/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-kube-api-access-qhpph\") pod \"nova-cell1-db-create-fkcb9\" (UID: \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\") " pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.711720 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.711747 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609e660e-8750-4956-93dc-c99f891c659f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.712068 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-operator-scripts\") pod \"nova-cell1-db-create-fkcb9\" (UID: \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\") " pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.773305 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhpph\" (UniqueName: \"kubernetes.io/projected/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-kube-api-access-qhpph\") pod \"nova-cell1-db-create-fkcb9\" (UID: \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\") " pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.779522 4941 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.790669 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5379-account-create-update-v8gsj"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.795752 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.797831 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.801998 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-v8gsj"] Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.812763 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f249265-748f-4a9d-a23c-7bd70a62b669-operator-scripts\") pod \"nova-cell0-3b99-account-create-update-mr8p7\" (UID: \"8f249265-748f-4a9d-a23c-7bd70a62b669\") " pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.812816 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5f2t\" (UniqueName: \"kubernetes.io/projected/8f249265-748f-4a9d-a23c-7bd70a62b669-kube-api-access-z5f2t\") pod \"nova-cell0-3b99-account-create-update-mr8p7\" (UID: \"8f249265-748f-4a9d-a23c-7bd70a62b669\") " pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.813712 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f249265-748f-4a9d-a23c-7bd70a62b669-operator-scripts\") pod \"nova-cell0-3b99-account-create-update-mr8p7\" (UID: 
\"8f249265-748f-4a9d-a23c-7bd70a62b669\") " pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.834063 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5f2t\" (UniqueName: \"kubernetes.io/projected/8f249265-748f-4a9d-a23c-7bd70a62b669-kube-api-access-z5f2t\") pod \"nova-cell0-3b99-account-create-update-mr8p7\" (UID: \"8f249265-748f-4a9d-a23c-7bd70a62b669\") " pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.919005 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5t5b\" (UniqueName: \"kubernetes.io/projected/e7bd51e4-e60f-457c-b0da-2d08369daf3c-kube-api-access-n5t5b\") pod \"nova-cell1-5379-account-create-update-v8gsj\" (UID: \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\") " pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.919254 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bd51e4-e60f-457c-b0da-2d08369daf3c-operator-scripts\") pod \"nova-cell1-5379-account-create-update-v8gsj\" (UID: \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\") " pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.956709 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"609e660e-8750-4956-93dc-c99f891c659f","Type":"ContainerDied","Data":"15d93da7ed069d414f8a4e60882644240d4ac90d75a9b71edb2ae343fa35293a"} Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.956754 4941 scope.go:117] "RemoveContainer" containerID="ca7bfc2e1bf2b29c667ade5852c57a363a583198606dffe5a8480cc5c981945a" Mar 07 07:14:12 crc kubenswrapper[4941]: I0307 07:14:12.956911 4941 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.021440 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5t5b\" (UniqueName: \"kubernetes.io/projected/e7bd51e4-e60f-457c-b0da-2d08369daf3c-kube-api-access-n5t5b\") pod \"nova-cell1-5379-account-create-update-v8gsj\" (UID: \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\") " pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.021494 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bd51e4-e60f-457c-b0da-2d08369daf3c-operator-scripts\") pod \"nova-cell1-5379-account-create-update-v8gsj\" (UID: \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\") " pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.022234 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bd51e4-e60f-457c-b0da-2d08369daf3c-operator-scripts\") pod \"nova-cell1-5379-account-create-update-v8gsj\" (UID: \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\") " pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.026233 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.036629 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.051343 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.053470 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5t5b\" (UniqueName: 
\"kubernetes.io/projected/e7bd51e4-e60f-457c-b0da-2d08369daf3c-kube-api-access-n5t5b\") pod \"nova-cell1-5379-account-create-update-v8gsj\" (UID: \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\") " pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.053680 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.056114 4941 scope.go:117] "RemoveContainer" containerID="ae7b9ebd5e5aabacc30e4c60737428ec243f2b66eb09db8c78c52e33bf4b18d3" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.056223 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.056628 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.076323 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.080285 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.093222 4941 scope.go:117] "RemoveContainer" containerID="6c3739b7f2a5e11c0e104ee916b474288fcc816263b755ea9a372604afda32c6" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.115416 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.123224 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.123282 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-scripts\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.123303 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.123328 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-run-httpd\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.123477 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrphm\" (UniqueName: \"kubernetes.io/projected/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-kube-api-access-xrphm\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc 
kubenswrapper[4941]: I0307 07:14:13.123688 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-config-data\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.123810 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-log-httpd\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.134028 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rnqcz"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.134044 4941 scope.go:117] "RemoveContainer" containerID="78e90976beb39d9ea7b54f5e247200f49f0375f48e5b8dc8df8dc30b49c4d321" Mar 07 07:14:13 crc kubenswrapper[4941]: W0307 07:14:13.150793 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1db1e7c6_8f4d_41ce_bb32_947ce9bfb24a.slice/crio-3f707808beb321a8bf7bd19f7f6a9d95a5df5bc4b89a5c7018f53f342833fea5 WatchSource:0}: Error finding container 3f707808beb321a8bf7bd19f7f6a9d95a5df5bc4b89a5c7018f53f342833fea5: Status 404 returned error can't find the container with id 3f707808beb321a8bf7bd19f7f6a9d95a5df5bc4b89a5c7018f53f342833fea5 Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.225823 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-scripts\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.225875 
4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.225910 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-run-httpd\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.225948 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrphm\" (UniqueName: \"kubernetes.io/projected/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-kube-api-access-xrphm\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.225996 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-config-data\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.226038 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-log-httpd\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.226606 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.228352 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-run-httpd\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.229039 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-log-httpd\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.232469 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.234914 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.237779 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-scripts\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.239171 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-config-data\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.250374 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrphm\" (UniqueName: \"kubernetes.io/projected/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-kube-api-access-xrphm\") pod \"ceilometer-0\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.369854 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mc6xs"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.392266 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.397276 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-427c-account-create-update-fzfss"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.515328 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fkcb9"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.637060 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mr8p7"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.758867 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-v8gsj"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.785007 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.978210 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609e660e-8750-4956-93dc-c99f891c659f" path="/var/lib/kubelet/pods/609e660e-8750-4956-93dc-c99f891c659f/volumes" Mar 07 07:14:13 crc 
kubenswrapper[4941]: I0307 07:14:13.980103 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5379-account-create-update-v8gsj" event={"ID":"e7bd51e4-e60f-457c-b0da-2d08369daf3c","Type":"ContainerStarted","Data":"66c89cae7a586a3c259279db05a9054ce4efceaa6552c34c64e0608c024538d4"} Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.980150 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" event={"ID":"8f249265-748f-4a9d-a23c-7bd70a62b669","Type":"ContainerStarted","Data":"50862a906878b747db4917451854ba4c058c6f6a59c6b20bad1d21d99fb48bfb"} Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.980743 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerStarted","Data":"ba5867ecac147a7e08d84a5a24e3c669d79a3837474fdf32b2aee09abddcc0c5"} Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.986365 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-427c-account-create-update-fzfss" event={"ID":"930d2540-f85c-425f-8750-75ceb6d183b7","Type":"ContainerStarted","Data":"10880c4fde76a453b01e4d08a3f775eca097761a8f888b1dae51eb88484b09cf"} Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.994888 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mc6xs" event={"ID":"53f6e27d-9523-4815-9efe-bf92df44ae37","Type":"ContainerStarted","Data":"f8f7a812693ec4737e88de8877359cbe2542deb142efe695c7e5c2c6e6b81e86"} Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.994932 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mc6xs" event={"ID":"53f6e27d-9523-4815-9efe-bf92df44ae37","Type":"ContainerStarted","Data":"bb22961c4607d6f486f6e50e2cebf98924957986e5d19dae7566bc892853ebf1"} Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.996243 4941 generic.go:334] "Generic (PLEG): 
container finished" podID="1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a" containerID="bbd647431dce8adc95d3e91bdfb368bc3cc17bc947f541ac45609fc6e7c16e1e" exitCode=0 Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.996328 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rnqcz" event={"ID":"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a","Type":"ContainerDied","Data":"bbd647431dce8adc95d3e91bdfb368bc3cc17bc947f541ac45609fc6e7c16e1e"} Mar 07 07:14:13 crc kubenswrapper[4941]: I0307 07:14:13.996344 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rnqcz" event={"ID":"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a","Type":"ContainerStarted","Data":"3f707808beb321a8bf7bd19f7f6a9d95a5df5bc4b89a5c7018f53f342833fea5"} Mar 07 07:14:14 crc kubenswrapper[4941]: I0307 07:14:14.002688 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fkcb9" event={"ID":"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39","Type":"ContainerStarted","Data":"00a40b2d4c9f4455f85c402834216df4b345ea50bf85f4dc496c3575f40cc1f6"} Mar 07 07:14:14 crc kubenswrapper[4941]: I0307 07:14:14.002733 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fkcb9" event={"ID":"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39","Type":"ContainerStarted","Data":"9d4fc9fd5f4bb3575ecaebd4ed9769b641ff82a68f4b2102c1e63db014cf856e"} Mar 07 07:14:14 crc kubenswrapper[4941]: I0307 07:14:14.171345 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-fkcb9" podStartSLOduration=2.171324428 podStartE2EDuration="2.171324428s" podCreationTimestamp="2026-03-07 07:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:14:14.121835782 +0000 UTC m=+1351.074201247" watchObservedRunningTime="2026-03-07 07:14:14.171324428 +0000 UTC m=+1351.123689903" Mar 07 07:14:14 crc 
kubenswrapper[4941]: I0307 07:14:14.178273 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-mc6xs" podStartSLOduration=2.177586727 podStartE2EDuration="2.177586727s" podCreationTimestamp="2026-03-07 07:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:14:14.136004312 +0000 UTC m=+1351.088369777" watchObservedRunningTime="2026-03-07 07:14:14.177586727 +0000 UTC m=+1351.129952202" Mar 07 07:14:14 crc kubenswrapper[4941]: I0307 07:14:14.320931 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 07 07:14:14 crc kubenswrapper[4941]: I0307 07:14:14.326199 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 07 07:14:14 crc kubenswrapper[4941]: I0307 07:14:14.351564 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 07 07:14:14 crc kubenswrapper[4941]: I0307 07:14:14.364430 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 07 07:14:14 crc kubenswrapper[4941]: E0307 07:14:14.397737 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c05d887_e05c_4593_a5ad_76be76a9e637.slice/crio-ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e5128d9_8fe6_4e85_8c0a_8f4a3a5a7b39.slice/crio-conmon-00a40b2d4c9f4455f85c402834216df4b345ea50bf85f4dc496c3575f40cc1f6.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7bd51e4_e60f_457c_b0da_2d08369daf3c.slice/crio-7151f2c385ba3425adbc17ab8a8c23b9f8502b8f68d9d7aa6b5c3ae6f0f0ee09.scope\": RecentStats: unable to find data in memory cache]" Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.013091 4941 generic.go:334] "Generic (PLEG): container finished" podID="8f249265-748f-4a9d-a23c-7bd70a62b669" containerID="714d0156b3edde1cf179d33dedfa80bd0b65ef39151a20043484346d70977085" exitCode=0 Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.013303 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" event={"ID":"8f249265-748f-4a9d-a23c-7bd70a62b669","Type":"ContainerDied","Data":"714d0156b3edde1cf179d33dedfa80bd0b65ef39151a20043484346d70977085"} Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.015425 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerStarted","Data":"a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda"} Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.016897 4941 generic.go:334] "Generic (PLEG): container finished" podID="930d2540-f85c-425f-8750-75ceb6d183b7" containerID="9b3dcf3e98f5a12c294bf352756527783cac59ff8354db8088af9195cd7d4f5f" exitCode=0 Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.016991 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-427c-account-create-update-fzfss" event={"ID":"930d2540-f85c-425f-8750-75ceb6d183b7","Type":"ContainerDied","Data":"9b3dcf3e98f5a12c294bf352756527783cac59ff8354db8088af9195cd7d4f5f"} Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.018381 4941 generic.go:334] "Generic (PLEG): container finished" podID="53f6e27d-9523-4815-9efe-bf92df44ae37" containerID="f8f7a812693ec4737e88de8877359cbe2542deb142efe695c7e5c2c6e6b81e86" exitCode=0 Mar 07 07:14:15 crc 
kubenswrapper[4941]: I0307 07:14:15.018456 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mc6xs" event={"ID":"53f6e27d-9523-4815-9efe-bf92df44ae37","Type":"ContainerDied","Data":"f8f7a812693ec4737e88de8877359cbe2542deb142efe695c7e5c2c6e6b81e86"} Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.019895 4941 generic.go:334] "Generic (PLEG): container finished" podID="4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39" containerID="00a40b2d4c9f4455f85c402834216df4b345ea50bf85f4dc496c3575f40cc1f6" exitCode=0 Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.019991 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fkcb9" event={"ID":"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39","Type":"ContainerDied","Data":"00a40b2d4c9f4455f85c402834216df4b345ea50bf85f4dc496c3575f40cc1f6"} Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.024771 4941 generic.go:334] "Generic (PLEG): container finished" podID="e7bd51e4-e60f-457c-b0da-2d08369daf3c" containerID="7151f2c385ba3425adbc17ab8a8c23b9f8502b8f68d9d7aa6b5c3ae6f0f0ee09" exitCode=0 Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.024931 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5379-account-create-update-v8gsj" event={"ID":"e7bd51e4-e60f-457c-b0da-2d08369daf3c","Type":"ContainerDied","Data":"7151f2c385ba3425adbc17ab8a8c23b9f8502b8f68d9d7aa6b5c3ae6f0f0ee09"} Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.025216 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.025260 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.451735 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.570628 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-operator-scripts\") pod \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\" (UID: \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\") " Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.570870 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8whrl\" (UniqueName: \"kubernetes.io/projected/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-kube-api-access-8whrl\") pod \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\" (UID: \"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a\") " Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.571201 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a" (UID: "1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.571630 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.576603 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-kube-api-access-8whrl" (OuterVolumeSpecName: "kube-api-access-8whrl") pod "1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a" (UID: "1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a"). InnerVolumeSpecName "kube-api-access-8whrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:15 crc kubenswrapper[4941]: I0307 07:14:15.672769 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8whrl\" (UniqueName: \"kubernetes.io/projected/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a-kube-api-access-8whrl\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.035267 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerStarted","Data":"87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8"} Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.035707 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerStarted","Data":"928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56"} Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.036819 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rnqcz" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.036823 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rnqcz" event={"ID":"1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a","Type":"ContainerDied","Data":"3f707808beb321a8bf7bd19f7f6a9d95a5df5bc4b89a5c7018f53f342833fea5"} Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.036859 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f707808beb321a8bf7bd19f7f6a9d95a5df5bc4b89a5c7018f53f342833fea5" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.333271 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.333552 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.371357 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.375032 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.405358 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.488881 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5t5b\" (UniqueName: \"kubernetes.io/projected/e7bd51e4-e60f-457c-b0da-2d08369daf3c-kube-api-access-n5t5b\") pod \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\" (UID: \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.489215 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bd51e4-e60f-457c-b0da-2d08369daf3c-operator-scripts\") pod \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\" (UID: \"e7bd51e4-e60f-457c-b0da-2d08369daf3c\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.496737 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bd51e4-e60f-457c-b0da-2d08369daf3c-kube-api-access-n5t5b" (OuterVolumeSpecName: "kube-api-access-n5t5b") pod "e7bd51e4-e60f-457c-b0da-2d08369daf3c" (UID: "e7bd51e4-e60f-457c-b0da-2d08369daf3c"). InnerVolumeSpecName "kube-api-access-n5t5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.506221 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bd51e4-e60f-457c-b0da-2d08369daf3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7bd51e4-e60f-457c-b0da-2d08369daf3c" (UID: "e7bd51e4-e60f-457c-b0da-2d08369daf3c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.591586 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bd51e4-e60f-457c-b0da-2d08369daf3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.591625 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5t5b\" (UniqueName: \"kubernetes.io/projected/e7bd51e4-e60f-457c-b0da-2d08369daf3c-kube-api-access-n5t5b\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.725048 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.732224 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.741969 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.749046 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.895701 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhpph\" (UniqueName: \"kubernetes.io/projected/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-kube-api-access-qhpph\") pod \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\" (UID: \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.895754 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-operator-scripts\") pod \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\" (UID: \"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.895792 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f249265-748f-4a9d-a23c-7bd70a62b669-operator-scripts\") pod \"8f249265-748f-4a9d-a23c-7bd70a62b669\" (UID: \"8f249265-748f-4a9d-a23c-7bd70a62b669\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.895915 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj7zf\" (UniqueName: \"kubernetes.io/projected/53f6e27d-9523-4815-9efe-bf92df44ae37-kube-api-access-sj7zf\") pod \"53f6e27d-9523-4815-9efe-bf92df44ae37\" (UID: \"53f6e27d-9523-4815-9efe-bf92df44ae37\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.895985 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5f2t\" (UniqueName: \"kubernetes.io/projected/8f249265-748f-4a9d-a23c-7bd70a62b669-kube-api-access-z5f2t\") pod \"8f249265-748f-4a9d-a23c-7bd70a62b669\" (UID: \"8f249265-748f-4a9d-a23c-7bd70a62b669\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.896012 4941 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jt8bp\" (UniqueName: \"kubernetes.io/projected/930d2540-f85c-425f-8750-75ceb6d183b7-kube-api-access-jt8bp\") pod \"930d2540-f85c-425f-8750-75ceb6d183b7\" (UID: \"930d2540-f85c-425f-8750-75ceb6d183b7\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.896058 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f6e27d-9523-4815-9efe-bf92df44ae37-operator-scripts\") pod \"53f6e27d-9523-4815-9efe-bf92df44ae37\" (UID: \"53f6e27d-9523-4815-9efe-bf92df44ae37\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.896083 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/930d2540-f85c-425f-8750-75ceb6d183b7-operator-scripts\") pod \"930d2540-f85c-425f-8750-75ceb6d183b7\" (UID: \"930d2540-f85c-425f-8750-75ceb6d183b7\") " Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.896593 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39" (UID: "4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.896742 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53f6e27d-9523-4815-9efe-bf92df44ae37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53f6e27d-9523-4815-9efe-bf92df44ae37" (UID: "53f6e27d-9523-4815-9efe-bf92df44ae37"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.896768 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f249265-748f-4a9d-a23c-7bd70a62b669-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f249265-748f-4a9d-a23c-7bd70a62b669" (UID: "8f249265-748f-4a9d-a23c-7bd70a62b669"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.896754 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/930d2540-f85c-425f-8750-75ceb6d183b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "930d2540-f85c-425f-8750-75ceb6d183b7" (UID: "930d2540-f85c-425f-8750-75ceb6d183b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.897059 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/930d2540-f85c-425f-8750-75ceb6d183b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.897081 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53f6e27d-9523-4815-9efe-bf92df44ae37-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.897090 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.897157 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8f249265-748f-4a9d-a23c-7bd70a62b669-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.900608 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-kube-api-access-qhpph" (OuterVolumeSpecName: "kube-api-access-qhpph") pod "4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39" (UID: "4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39"). InnerVolumeSpecName "kube-api-access-qhpph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.900659 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f6e27d-9523-4815-9efe-bf92df44ae37-kube-api-access-sj7zf" (OuterVolumeSpecName: "kube-api-access-sj7zf") pod "53f6e27d-9523-4815-9efe-bf92df44ae37" (UID: "53f6e27d-9523-4815-9efe-bf92df44ae37"). InnerVolumeSpecName "kube-api-access-sj7zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.902390 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930d2540-f85c-425f-8750-75ceb6d183b7-kube-api-access-jt8bp" (OuterVolumeSpecName: "kube-api-access-jt8bp") pod "930d2540-f85c-425f-8750-75ceb6d183b7" (UID: "930d2540-f85c-425f-8750-75ceb6d183b7"). InnerVolumeSpecName "kube-api-access-jt8bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.905024 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f249265-748f-4a9d-a23c-7bd70a62b669-kube-api-access-z5f2t" (OuterVolumeSpecName: "kube-api-access-z5f2t") pod "8f249265-748f-4a9d-a23c-7bd70a62b669" (UID: "8f249265-748f-4a9d-a23c-7bd70a62b669"). InnerVolumeSpecName "kube-api-access-z5f2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.956525 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.998966 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5f2t\" (UniqueName: \"kubernetes.io/projected/8f249265-748f-4a9d-a23c-7bd70a62b669-kube-api-access-z5f2t\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.998999 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt8bp\" (UniqueName: \"kubernetes.io/projected/930d2540-f85c-425f-8750-75ceb6d183b7-kube-api-access-jt8bp\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.999010 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhpph\" (UniqueName: \"kubernetes.io/projected/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39-kube-api-access-qhpph\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:16 crc kubenswrapper[4941]: I0307 07:14:16.999019 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj7zf\" (UniqueName: \"kubernetes.io/projected/53f6e27d-9523-4815-9efe-bf92df44ae37-kube-api-access-sj7zf\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.045631 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mc6xs" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.045656 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mc6xs" event={"ID":"53f6e27d-9523-4815-9efe-bf92df44ae37","Type":"ContainerDied","Data":"bb22961c4607d6f486f6e50e2cebf98924957986e5d19dae7566bc892853ebf1"} Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.045698 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb22961c4607d6f486f6e50e2cebf98924957986e5d19dae7566bc892853ebf1" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.047302 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fkcb9" event={"ID":"4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39","Type":"ContainerDied","Data":"9d4fc9fd5f4bb3575ecaebd4ed9769b641ff82a68f4b2102c1e63db014cf856e"} Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.047326 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fkcb9" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.047330 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d4fc9fd5f4bb3575ecaebd4ed9769b641ff82a68f4b2102c1e63db014cf856e" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.049293 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5379-account-create-update-v8gsj" event={"ID":"e7bd51e4-e60f-457c-b0da-2d08369daf3c","Type":"ContainerDied","Data":"66c89cae7a586a3c259279db05a9054ce4efceaa6552c34c64e0608c024538d4"} Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.049330 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66c89cae7a586a3c259279db05a9054ce4efceaa6552c34c64e0608c024538d4" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.049551 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-v8gsj" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.050943 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" event={"ID":"8f249265-748f-4a9d-a23c-7bd70a62b669","Type":"ContainerDied","Data":"50862a906878b747db4917451854ba4c058c6f6a59c6b20bad1d21d99fb48bfb"} Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.050968 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50862a906878b747db4917451854ba4c058c6f6a59c6b20bad1d21d99fb48bfb" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.051013 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b99-account-create-update-mr8p7" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.055965 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-427c-account-create-update-fzfss" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.056898 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-427c-account-create-update-fzfss" event={"ID":"930d2540-f85c-425f-8750-75ceb6d183b7","Type":"ContainerDied","Data":"10880c4fde76a453b01e4d08a3f775eca097761a8f888b1dae51eb88484b09cf"} Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.056927 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10880c4fde76a453b01e4d08a3f775eca097761a8f888b1dae51eb88484b09cf" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.056947 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.056959 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.492131 
4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.492241 4941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 07:14:17 crc kubenswrapper[4941]: I0307 07:14:17.502716 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 07 07:14:18 crc kubenswrapper[4941]: I0307 07:14:18.065062 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerStarted","Data":"c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623"} Mar 07 07:14:18 crc kubenswrapper[4941]: I0307 07:14:18.065172 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="ceilometer-central-agent" containerID="cri-o://a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda" gracePeriod=30 Mar 07 07:14:18 crc kubenswrapper[4941]: I0307 07:14:18.065220 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="proxy-httpd" containerID="cri-o://c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623" gracePeriod=30 Mar 07 07:14:18 crc kubenswrapper[4941]: I0307 07:14:18.065237 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="ceilometer-notification-agent" containerID="cri-o://928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56" gracePeriod=30 Mar 07 07:14:18 crc kubenswrapper[4941]: I0307 07:14:18.065263 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="sg-core" containerID="cri-o://87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8" gracePeriod=30 Mar 07 07:14:18 crc kubenswrapper[4941]: I0307 07:14:18.065595 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 07 07:14:18 crc kubenswrapper[4941]: I0307 07:14:18.087710 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.485676966 podStartE2EDuration="5.087692069s" podCreationTimestamp="2026-03-07 07:14:13 +0000 UTC" firstStartedPulling="2026-03-07 07:14:13.80814705 +0000 UTC m=+1350.760512515" lastFinishedPulling="2026-03-07 07:14:17.410162153 +0000 UTC m=+1354.362527618" observedRunningTime="2026-03-07 07:14:18.08259444 +0000 UTC m=+1355.034959905" watchObservedRunningTime="2026-03-07 07:14:18.087692069 +0000 UTC m=+1355.040057534" Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.107377 4941 generic.go:334] "Generic (PLEG): container finished" podID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerID="c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623" exitCode=0 Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.107729 4941 generic.go:334] "Generic (PLEG): container finished" podID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerID="87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8" exitCode=2 Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.107738 4941 generic.go:334] "Generic (PLEG): container finished" podID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerID="928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56" exitCode=0 Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.108685 4941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.108697 4941 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 
07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.108675 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerDied","Data":"c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623"} Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.108744 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerDied","Data":"87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8"} Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.108758 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerDied","Data":"928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56"} Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.362372 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:19 crc kubenswrapper[4941]: I0307 07:14:19.363843 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.821367 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8lhkg"] Mar 07 07:14:22 crc kubenswrapper[4941]: E0307 07:14:22.821717 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f249265-748f-4a9d-a23c-7bd70a62b669" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.821728 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f249265-748f-4a9d-a23c-7bd70a62b669" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: E0307 07:14:22.821744 4941 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="930d2540-f85c-425f-8750-75ceb6d183b7" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.821750 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="930d2540-f85c-425f-8750-75ceb6d183b7" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: E0307 07:14:22.821761 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bd51e4-e60f-457c-b0da-2d08369daf3c" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.821767 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bd51e4-e60f-457c-b0da-2d08369daf3c" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: E0307 07:14:22.821778 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f6e27d-9523-4815-9efe-bf92df44ae37" containerName="mariadb-database-create" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.821783 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f6e27d-9523-4815-9efe-bf92df44ae37" containerName="mariadb-database-create" Mar 07 07:14:22 crc kubenswrapper[4941]: E0307 07:14:22.821806 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a" containerName="mariadb-database-create" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.821811 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a" containerName="mariadb-database-create" Mar 07 07:14:22 crc kubenswrapper[4941]: E0307 07:14:22.821830 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39" containerName="mariadb-database-create" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.821835 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39" containerName="mariadb-database-create" Mar 07 07:14:22 crc 
kubenswrapper[4941]: I0307 07:14:22.821979 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="930d2540-f85c-425f-8750-75ceb6d183b7" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.821991 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a" containerName="mariadb-database-create" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.822001 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f249265-748f-4a9d-a23c-7bd70a62b669" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.822009 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39" containerName="mariadb-database-create" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.822018 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bd51e4-e60f-457c-b0da-2d08369daf3c" containerName="mariadb-account-create-update" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.822028 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f6e27d-9523-4815-9efe-bf92df44ae37" containerName="mariadb-database-create" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.823079 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.834142 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.834157 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bv2tx" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.839357 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.844592 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8lhkg"] Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.906128 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-config-data\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.906173 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pwv2\" (UniqueName: \"kubernetes.io/projected/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-kube-api-access-4pwv2\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.906201 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-scripts\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " 
pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:22 crc kubenswrapper[4941]: I0307 07:14:22.906312 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.007583 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.007654 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-config-data\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.007677 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pwv2\" (UniqueName: \"kubernetes.io/projected/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-kube-api-access-4pwv2\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.007703 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-scripts\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: 
\"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.013322 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-config-data\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.013507 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-scripts\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.013789 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.026227 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pwv2\" (UniqueName: \"kubernetes.io/projected/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-kube-api-access-4pwv2\") pod \"nova-cell0-conductor-db-sync-8lhkg\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.143195 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.608116 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8lhkg"] Mar 07 07:14:23 crc kubenswrapper[4941]: W0307 07:14:23.615367 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ceb33a6_9365_45ba_99a7_db9a11b3e7ca.slice/crio-64b8c332c33785df9df5ab43816e7237c49b86cabae3836824864707c5684a45 WatchSource:0}: Error finding container 64b8c332c33785df9df5ab43816e7237c49b86cabae3836824864707c5684a45: Status 404 returned error can't find the container with id 64b8c332c33785df9df5ab43816e7237c49b86cabae3836824864707c5684a45 Mar 07 07:14:23 crc kubenswrapper[4941]: I0307 07:14:23.876581 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.023628 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-run-httpd\") pod \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.024182 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrphm\" (UniqueName: \"kubernetes.io/projected/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-kube-api-access-xrphm\") pod \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.024280 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-scripts\") pod \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\" (UID: 
\"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.024369 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-combined-ca-bundle\") pod \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.024298 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" (UID: "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.024659 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-sg-core-conf-yaml\") pod \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.024782 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-log-httpd\") pod \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.024876 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-config-data\") pod \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\" (UID: \"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8\") " Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.025578 4941 reconciler_common.go:293] "Volume detached for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.026512 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" (UID: "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.030184 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-kube-api-access-xrphm" (OuterVolumeSpecName: "kube-api-access-xrphm") pod "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" (UID: "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8"). InnerVolumeSpecName "kube-api-access-xrphm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.030587 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-scripts" (OuterVolumeSpecName: "scripts") pod "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" (UID: "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.050916 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" (UID: "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.103543 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" (UID: "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.111477 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-config-data" (OuterVolumeSpecName: "config-data") pod "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" (UID: "f7ceea3b-8740-424a-a261-f0ecb5e0a5b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.127515 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.127740 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.127828 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrphm\" (UniqueName: \"kubernetes.io/projected/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-kube-api-access-xrphm\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.127911 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:24 crc 
kubenswrapper[4941]: I0307 07:14:24.127982 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.128058 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.160851 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" event={"ID":"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca","Type":"ContainerStarted","Data":"64b8c332c33785df9df5ab43816e7237c49b86cabae3836824864707c5684a45"} Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.163356 4941 generic.go:334] "Generic (PLEG): container finished" podID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerID="a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda" exitCode=0 Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.163385 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerDied","Data":"a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda"} Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.163405 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ceea3b-8740-424a-a261-f0ecb5e0a5b8","Type":"ContainerDied","Data":"ba5867ecac147a7e08d84a5a24e3c669d79a3837474fdf32b2aee09abddcc0c5"} Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.163453 4941 scope.go:117] "RemoveContainer" containerID="c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.163567 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.204737 4941 scope.go:117] "RemoveContainer" containerID="87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.209782 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.219040 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.231356 4941 scope.go:117] "RemoveContainer" containerID="928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.249113 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.249711 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="proxy-httpd" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.250015 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="proxy-httpd" Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.250112 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="ceilometer-notification-agent" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.250170 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="ceilometer-notification-agent" Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.250256 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="ceilometer-central-agent" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.250311 4941 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="ceilometer-central-agent" Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.250371 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="sg-core" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.250449 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="sg-core" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.250697 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="ceilometer-central-agent" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.250761 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="proxy-httpd" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.250819 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="sg-core" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.250882 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" containerName="ceilometer-notification-agent" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.252448 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.256943 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.258617 4941 scope.go:117] "RemoveContainer" containerID="a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.276668 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.276856 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.326089 4941 scope.go:117] "RemoveContainer" containerID="c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623" Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.326658 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623\": container with ID starting with c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623 not found: ID does not exist" containerID="c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.326688 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623"} err="failed to get container status \"c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623\": rpc error: code = NotFound desc = could not find container \"c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623\": container with ID starting with c679341b4957dd5b44cc12ff6e2f7e2e4958cef33ef5dc423306a62c3ab70623 not found: ID does not exist" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 
07:14:24.326708 4941 scope.go:117] "RemoveContainer" containerID="87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8" Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.327093 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8\": container with ID starting with 87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8 not found: ID does not exist" containerID="87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.327124 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8"} err="failed to get container status \"87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8\": rpc error: code = NotFound desc = could not find container \"87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8\": container with ID starting with 87b20a82e53d6462136e99709ef1c83cf90134eb26d7db8b8471dabb540e37a8 not found: ID does not exist" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.327142 4941 scope.go:117] "RemoveContainer" containerID="928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56" Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.327555 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56\": container with ID starting with 928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56 not found: ID does not exist" containerID="928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.327607 4941 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56"} err="failed to get container status \"928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56\": rpc error: code = NotFound desc = could not find container \"928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56\": container with ID starting with 928e36137f19e6aa2579806e89476762e0d9243ad7f467eaaddc793910ae5f56 not found: ID does not exist" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.327642 4941 scope.go:117] "RemoveContainer" containerID="a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda" Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.328150 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda\": container with ID starting with a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda not found: ID does not exist" containerID="a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.328174 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda"} err="failed to get container status \"a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda\": rpc error: code = NotFound desc = could not find container \"a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda\": container with ID starting with a401b3eb7d0999dfa379e6673f99bca5892a9b9507b42604fae91c94a4b3afda not found: ID does not exist" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.331167 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.331215 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-config-data\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.331234 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-scripts\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.331260 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.331287 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn65p\" (UniqueName: \"kubernetes.io/projected/2ab97627-6b7f-4984-af5f-a732f13b2486-kube-api-access-kn65p\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.331311 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-run-httpd\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 
07:14:24.331352 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-log-httpd\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.433418 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.433533 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn65p\" (UniqueName: \"kubernetes.io/projected/2ab97627-6b7f-4984-af5f-a732f13b2486-kube-api-access-kn65p\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.435793 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-run-httpd\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.435892 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-log-httpd\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.436028 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.436677 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-run-httpd\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.436958 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-log-httpd\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.437969 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.439922 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.442386 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-config-data\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.442477 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-scripts\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.447214 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-scripts\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.448369 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-config-data\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.465531 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn65p\" (UniqueName: \"kubernetes.io/projected/2ab97627-6b7f-4984-af5f-a732f13b2486-kube-api-access-kn65p\") pod \"ceilometer-0\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: I0307 07:14:24.600805 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:14:24 crc kubenswrapper[4941]: E0307 07:14:24.672989 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c05d887_e05c_4593_a5ad_76be76a9e637.slice/crio-ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3\": RecentStats: unable to find data in memory cache]" Mar 07 07:14:25 crc kubenswrapper[4941]: I0307 07:14:25.025715 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:25 crc kubenswrapper[4941]: W0307 07:14:25.037433 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab97627_6b7f_4984_af5f_a732f13b2486.slice/crio-accece7c95c66a08119647336a9c3a2bf1fa2a3706527a1b2e0a8b33c272492d WatchSource:0}: Error finding container accece7c95c66a08119647336a9c3a2bf1fa2a3706527a1b2e0a8b33c272492d: Status 404 returned error can't find the container with id accece7c95c66a08119647336a9c3a2bf1fa2a3706527a1b2e0a8b33c272492d Mar 07 07:14:25 crc kubenswrapper[4941]: I0307 07:14:25.193008 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerStarted","Data":"accece7c95c66a08119647336a9c3a2bf1fa2a3706527a1b2e0a8b33c272492d"} Mar 07 07:14:25 crc kubenswrapper[4941]: I0307 07:14:25.966333 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ceea3b-8740-424a-a261-f0ecb5e0a5b8" path="/var/lib/kubelet/pods/f7ceea3b-8740-424a-a261-f0ecb5e0a5b8/volumes" Mar 07 07:14:26 crc kubenswrapper[4941]: I0307 07:14:26.204713 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerStarted","Data":"ea3be111ebf6ccdd8344a06aaac53960cb585205448af6727079dd15cb825d66"} 
Mar 07 07:14:32 crc kubenswrapper[4941]: I0307 07:14:32.262083 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" event={"ID":"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca","Type":"ContainerStarted","Data":"476fb5c632c42dcf296ba56237913854497fa76de1766c6d57e69b4168f8a311"} Mar 07 07:14:32 crc kubenswrapper[4941]: I0307 07:14:32.263993 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerStarted","Data":"4644b7abc8347fbcc44accca2ab2f75d488b41012a2fc743e0f3d6b3860cf462"} Mar 07 07:14:33 crc kubenswrapper[4941]: I0307 07:14:33.099391 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" podStartSLOduration=3.164466155 podStartE2EDuration="11.099367901s" podCreationTimestamp="2026-03-07 07:14:22 +0000 UTC" firstStartedPulling="2026-03-07 07:14:23.619603845 +0000 UTC m=+1360.571969310" lastFinishedPulling="2026-03-07 07:14:31.554505591 +0000 UTC m=+1368.506871056" observedRunningTime="2026-03-07 07:14:32.286781947 +0000 UTC m=+1369.239147412" watchObservedRunningTime="2026-03-07 07:14:33.099367901 +0000 UTC m=+1370.051733376" Mar 07 07:14:33 crc kubenswrapper[4941]: I0307 07:14:33.119690 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:14:33 crc kubenswrapper[4941]: I0307 07:14:33.275522 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerStarted","Data":"f29ef3767e472ba307a6f90a5ede0e8a49f38ed37e729ca8db1ef166df3f9e5c"} Mar 07 07:14:34 crc kubenswrapper[4941]: I0307 07:14:34.288978 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerStarted","Data":"c58e5b9c34ddbe30187350ca2343ceb9e751aeabfefecaacc487d2225fa6920e"} 
Mar 07 07:14:34 crc kubenswrapper[4941]: I0307 07:14:34.289311 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="sg-core" containerID="cri-o://f29ef3767e472ba307a6f90a5ede0e8a49f38ed37e729ca8db1ef166df3f9e5c" gracePeriod=30 Mar 07 07:14:34 crc kubenswrapper[4941]: I0307 07:14:34.289346 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="ceilometer-notification-agent" containerID="cri-o://4644b7abc8347fbcc44accca2ab2f75d488b41012a2fc743e0f3d6b3860cf462" gracePeriod=30 Mar 07 07:14:34 crc kubenswrapper[4941]: I0307 07:14:34.289246 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="ceilometer-central-agent" containerID="cri-o://ea3be111ebf6ccdd8344a06aaac53960cb585205448af6727079dd15cb825d66" gracePeriod=30 Mar 07 07:14:34 crc kubenswrapper[4941]: I0307 07:14:34.289320 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="proxy-httpd" containerID="cri-o://c58e5b9c34ddbe30187350ca2343ceb9e751aeabfefecaacc487d2225fa6920e" gracePeriod=30 Mar 07 07:14:34 crc kubenswrapper[4941]: I0307 07:14:34.289549 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 07 07:14:34 crc kubenswrapper[4941]: I0307 07:14:34.325025 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.371594915 podStartE2EDuration="10.325003742s" podCreationTimestamp="2026-03-07 07:14:24 +0000 UTC" firstStartedPulling="2026-03-07 07:14:25.0402356 +0000 UTC m=+1361.992601075" lastFinishedPulling="2026-03-07 07:14:33.993644397 +0000 UTC 
m=+1370.946009902" observedRunningTime="2026-03-07 07:14:34.310952827 +0000 UTC m=+1371.263318312" watchObservedRunningTime="2026-03-07 07:14:34.325003742 +0000 UTC m=+1371.277369217" Mar 07 07:14:34 crc kubenswrapper[4941]: E0307 07:14:34.884573 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c05d887_e05c_4593_a5ad_76be76a9e637.slice/crio-ae27b0b0a7d883e7d199f560fdf2575e3aa824f63defc11b15fd15492f8755c3\": RecentStats: unable to find data in memory cache]" Mar 07 07:14:35 crc kubenswrapper[4941]: I0307 07:14:35.301799 4941 generic.go:334] "Generic (PLEG): container finished" podID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerID="f29ef3767e472ba307a6f90a5ede0e8a49f38ed37e729ca8db1ef166df3f9e5c" exitCode=2 Mar 07 07:14:35 crc kubenswrapper[4941]: I0307 07:14:35.301843 4941 generic.go:334] "Generic (PLEG): container finished" podID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerID="4644b7abc8347fbcc44accca2ab2f75d488b41012a2fc743e0f3d6b3860cf462" exitCode=0 Mar 07 07:14:35 crc kubenswrapper[4941]: I0307 07:14:35.301853 4941 generic.go:334] "Generic (PLEG): container finished" podID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerID="ea3be111ebf6ccdd8344a06aaac53960cb585205448af6727079dd15cb825d66" exitCode=0 Mar 07 07:14:35 crc kubenswrapper[4941]: I0307 07:14:35.301874 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerDied","Data":"f29ef3767e472ba307a6f90a5ede0e8a49f38ed37e729ca8db1ef166df3f9e5c"} Mar 07 07:14:35 crc kubenswrapper[4941]: I0307 07:14:35.301903 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerDied","Data":"4644b7abc8347fbcc44accca2ab2f75d488b41012a2fc743e0f3d6b3860cf462"} Mar 07 07:14:35 crc kubenswrapper[4941]: I0307 
07:14:35.301916 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerDied","Data":"ea3be111ebf6ccdd8344a06aaac53960cb585205448af6727079dd15cb825d66"} Mar 07 07:14:40 crc kubenswrapper[4941]: I0307 07:14:40.314003 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:14:40 crc kubenswrapper[4941]: I0307 07:14:40.314608 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:14:41 crc kubenswrapper[4941]: I0307 07:14:41.362572 4941 generic.go:334] "Generic (PLEG): container finished" podID="7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" containerID="476fb5c632c42dcf296ba56237913854497fa76de1766c6d57e69b4168f8a311" exitCode=0 Mar 07 07:14:41 crc kubenswrapper[4941]: I0307 07:14:41.362684 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" event={"ID":"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca","Type":"ContainerDied","Data":"476fb5c632c42dcf296ba56237913854497fa76de1766c6d57e69b4168f8a311"} Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.788912 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.898173 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-combined-ca-bundle\") pod \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.898278 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-scripts\") pod \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.898534 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pwv2\" (UniqueName: \"kubernetes.io/projected/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-kube-api-access-4pwv2\") pod \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.898727 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-config-data\") pod \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\" (UID: \"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca\") " Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.903695 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-scripts" (OuterVolumeSpecName: "scripts") pod "7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" (UID: "7ceb33a6-9365-45ba-99a7-db9a11b3e7ca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.903706 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-kube-api-access-4pwv2" (OuterVolumeSpecName: "kube-api-access-4pwv2") pod "7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" (UID: "7ceb33a6-9365-45ba-99a7-db9a11b3e7ca"). InnerVolumeSpecName "kube-api-access-4pwv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.924215 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-config-data" (OuterVolumeSpecName: "config-data") pod "7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" (UID: "7ceb33a6-9365-45ba-99a7-db9a11b3e7ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:42 crc kubenswrapper[4941]: I0307 07:14:42.950677 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" (UID: "7ceb33a6-9365-45ba-99a7-db9a11b3e7ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.002470 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.002517 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.002538 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.002555 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pwv2\" (UniqueName: \"kubernetes.io/projected/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca-kube-api-access-4pwv2\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.409970 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" event={"ID":"7ceb33a6-9365-45ba-99a7-db9a11b3e7ca","Type":"ContainerDied","Data":"64b8c332c33785df9df5ab43816e7237c49b86cabae3836824864707c5684a45"} Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.410501 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64b8c332c33785df9df5ab43816e7237c49b86cabae3836824864707c5684a45" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.411059 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8lhkg" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.522661 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 07 07:14:43 crc kubenswrapper[4941]: E0307 07:14:43.523142 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" containerName="nova-cell0-conductor-db-sync" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.523162 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" containerName="nova-cell0-conductor-db-sync" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.523418 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" containerName="nova-cell0-conductor-db-sync" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.524107 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.526420 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bv2tx" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.526761 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.532461 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.614857 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 
07:14:43.615210 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.615510 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wv4m\" (UniqueName: \"kubernetes.io/projected/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-kube-api-access-5wv4m\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.717512 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wv4m\" (UniqueName: \"kubernetes.io/projected/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-kube-api-access-5wv4m\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.717901 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.718119 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.724324 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.725938 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.750295 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wv4m\" (UniqueName: \"kubernetes.io/projected/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-kube-api-access-5wv4m\") pod \"nova-cell0-conductor-0\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:43 crc kubenswrapper[4941]: I0307 07:14:43.847899 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:44 crc kubenswrapper[4941]: I0307 07:14:44.312896 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 07 07:14:44 crc kubenswrapper[4941]: I0307 07:14:44.420221 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20","Type":"ContainerStarted","Data":"ba2cd2c70b60073262a137329b590ea441c9826a1e5ac8aec0621a560eaf1650"} Mar 07 07:14:45 crc kubenswrapper[4941]: I0307 07:14:45.433916 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20","Type":"ContainerStarted","Data":"d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e"} Mar 07 07:14:45 crc kubenswrapper[4941]: I0307 07:14:45.434215 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:45 crc kubenswrapper[4941]: I0307 07:14:45.460663 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.460636095 podStartE2EDuration="2.460636095s" podCreationTimestamp="2026-03-07 07:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:14:45.456296806 +0000 UTC m=+1382.408662331" watchObservedRunningTime="2026-03-07 07:14:45.460636095 +0000 UTC m=+1382.413001600" Mar 07 07:14:48 crc kubenswrapper[4941]: E0307 07:14:48.376177 4941 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/84b19ee5a55ba0a089f09e48335bf9c467e619781c0aaf66fcfe53301e49c68e/diff" to get inode usage: stat /var/lib/containers/storage/overlay/84b19ee5a55ba0a089f09e48335bf9c467e619781c0aaf66fcfe53301e49c68e/diff: no such file or directory, 
extraDiskErr: Mar 07 07:14:53 crc kubenswrapper[4941]: I0307 07:14:53.896474 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.338574 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gt9jr"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.341009 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.343108 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.350731 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.356996 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gt9jr"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.507010 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-scripts\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.507203 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.507243 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7xcvx\" (UniqueName: \"kubernetes.io/projected/672d89d2-46b4-449f-ad71-2716d50eb2fe-kube-api-access-7xcvx\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.507357 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-config-data\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.545733 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.548721 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.551006 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.605629 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.609084 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.609121 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xcvx\" (UniqueName: \"kubernetes.io/projected/672d89d2-46b4-449f-ad71-2716d50eb2fe-kube-api-access-7xcvx\") pod 
\"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.609195 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-config-data\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.609251 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-scripts\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.632848 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.654024 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-scripts\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.654166 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.654415 
4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-config-data\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.664916 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xcvx\" (UniqueName: \"kubernetes.io/projected/672d89d2-46b4-449f-ad71-2716d50eb2fe-kube-api-access-7xcvx\") pod \"nova-cell0-cell-mapping-gt9jr\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.669208 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.670660 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.672755 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.703834 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.725049 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.725089 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgtl7\" (UniqueName: 
\"kubernetes.io/projected/6712cc3f-2e68-4522-8624-84903da6d19d-kube-api-access-qgtl7\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.725120 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-config-data\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.725164 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0954e7-1af0-4363-9fd4-8713146c0bc6-logs\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.725193 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6712cc3f-2e68-4522-8624-84903da6d19d-logs\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.725206 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.725224 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hct2z\" (UniqueName: \"kubernetes.io/projected/8d0954e7-1af0-4363-9fd4-8713146c0bc6-kube-api-access-hct2z\") pod \"nova-api-0\" (UID: 
\"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.725249 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-config-data\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.774461 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.775875 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.782188 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.785771 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.815378 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-fxq48"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.825122 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.825204 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-fxq48"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.826623 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.826651 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgtl7\" (UniqueName: \"kubernetes.io/projected/6712cc3f-2e68-4522-8624-84903da6d19d-kube-api-access-qgtl7\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.826681 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-config-data\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.826728 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0954e7-1af0-4363-9fd4-8713146c0bc6-logs\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.826756 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6712cc3f-2e68-4522-8624-84903da6d19d-logs\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" 
Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.826769 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.826806 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hct2z\" (UniqueName: \"kubernetes.io/projected/8d0954e7-1af0-4363-9fd4-8713146c0bc6-kube-api-access-hct2z\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.826826 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-config-data\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.827831 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0954e7-1af0-4363-9fd4-8713146c0bc6-logs\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.834828 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-config-data\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.839669 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6712cc3f-2e68-4522-8624-84903da6d19d-logs\") pod 
\"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.840320 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.844769 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.853139 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.854581 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-config-data\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.855244 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.857614 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.859435 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.879997 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgtl7\" (UniqueName: 
\"kubernetes.io/projected/6712cc3f-2e68-4522-8624-84903da6d19d-kube-api-access-qgtl7\") pod \"nova-metadata-0\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.887998 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hct2z\" (UniqueName: \"kubernetes.io/projected/8d0954e7-1af0-4363-9fd4-8713146c0bc6-kube-api-access-hct2z\") pod \"nova-api-0\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") " pod="openstack/nova-api-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.913808 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.929272 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-config\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.929313 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtq7d\" (UniqueName: \"kubernetes.io/projected/74e3be4e-1078-4655-bd5f-f5a7c6550256-kube-api-access-mtq7d\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.929356 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5f75\" (UniqueName: \"kubernetes.io/projected/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-kube-api-access-d5f75\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.929376 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-swift-storage-0\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.929417 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-sb\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.929435 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-nb\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.929470 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.929554 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-svc\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 
07:14:54.929611 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-config-data\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:54 crc kubenswrapper[4941]: I0307 07:14:54.962801 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.032524 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.032614 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-svc\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.032668 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-config-data\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033337 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqcp6\" (UniqueName: \"kubernetes.io/projected/a6edec94-977f-493a-97b6-f90e02d07467-kube-api-access-zqcp6\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033429 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-config\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033461 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtq7d\" (UniqueName: \"kubernetes.io/projected/74e3be4e-1078-4655-bd5f-f5a7c6550256-kube-api-access-mtq7d\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033515 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5f75\" (UniqueName: \"kubernetes.io/projected/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-kube-api-access-d5f75\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033531 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-swift-storage-0\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033578 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-svc\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" 
Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033587 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-sb\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033653 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-nb\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033690 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.033770 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.034474 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-sb\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.034836 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-nb\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.035089 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-config\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.035304 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-swift-storage-0\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.038557 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.039756 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-config-data\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.053103 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtq7d\" (UniqueName: 
\"kubernetes.io/projected/74e3be4e-1078-4655-bd5f-f5a7c6550256-kube-api-access-mtq7d\") pod \"nova-scheduler-0\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") " pod="openstack/nova-scheduler-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.055008 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5f75\" (UniqueName: \"kubernetes.io/projected/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-kube-api-access-d5f75\") pod \"dnsmasq-dns-97cdf8549-fxq48\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.091369 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.121993 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.135219 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.135282 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.135331 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqcp6\" (UniqueName: \"kubernetes.io/projected/a6edec94-977f-493a-97b6-f90e02d07467-kube-api-access-zqcp6\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.141116 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.144001 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.158087 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqcp6\" (UniqueName: \"kubernetes.io/projected/a6edec94-977f-493a-97b6-f90e02d07467-kube-api-access-zqcp6\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.264202 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.271186 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.456330 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n4cfj"] Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.457398 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.473200 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.473250 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.489582 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n4cfj"] Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.520740 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.531511 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gt9jr"] Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.651777 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drv9j\" (UniqueName: \"kubernetes.io/projected/2d028683-343b-490d-9790-202e64e4e721-kube-api-access-drv9j\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.651876 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-scripts\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.651930 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.651976 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-config-data\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.735188 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.753866 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.753949 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-config-data\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.754165 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drv9j\" (UniqueName: \"kubernetes.io/projected/2d028683-343b-490d-9790-202e64e4e721-kube-api-access-drv9j\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " 
pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.754255 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-scripts\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.766249 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-config-data\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.770523 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-scripts\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.772781 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.774810 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.794655 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drv9j\" (UniqueName: 
\"kubernetes.io/projected/2d028683-343b-490d-9790-202e64e4e721-kube-api-access-drv9j\") pod \"nova-cell1-conductor-db-sync-n4cfj\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:55 crc kubenswrapper[4941]: I0307 07:14:55.900107 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-fxq48"] Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.026434 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.092097 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.549736 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6712cc3f-2e68-4522-8624-84903da6d19d","Type":"ContainerStarted","Data":"8abc501831e1c09dad8a5ba8bb62774660563f5b0122251592f1276911ff2bcf"} Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.553341 4941 generic.go:334] "Generic (PLEG): container finished" podID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" containerID="782b4cb6800a49583e5d4febde0e2f055e88966b8bd69833d2b9828e747c886f" exitCode=0 Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.553427 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" event={"ID":"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d","Type":"ContainerDied","Data":"782b4cb6800a49583e5d4febde0e2f055e88966b8bd69833d2b9828e747c886f"} Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.553451 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" event={"ID":"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d","Type":"ContainerStarted","Data":"afec75b51812e99bc9cb9d8b7a72ad1a28d60db8ede1e30e8b75d4a67ca96ef1"} Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.554600 4941 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gt9jr" event={"ID":"672d89d2-46b4-449f-ad71-2716d50eb2fe","Type":"ContainerStarted","Data":"c31df48ec45a74d7943bce634439f98715ea8900b473d4e84dcbc0cbfcb889ec"} Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.554648 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gt9jr" event={"ID":"672d89d2-46b4-449f-ad71-2716d50eb2fe","Type":"ContainerStarted","Data":"4f5058965f354e843b1ebd82108ea85c1f068002cfa9626bf3e9a03e5cf5d42e"} Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.713271 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"74e3be4e-1078-4655-bd5f-f5a7c6550256","Type":"ContainerStarted","Data":"850ff261d488ad60c7f02beae257ccfb3e65161de267906209b49038db815495"} Mar 07 07:14:56 crc kubenswrapper[4941]: I0307 07:14:56.720190 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a6edec94-977f-493a-97b6-f90e02d07467","Type":"ContainerStarted","Data":"dc6950657c5ab8fb8999ca939674220f8a785809d3a9cd68c61e8c9952acf6bc"} Mar 07 07:14:57 crc kubenswrapper[4941]: I0307 07:14:57.316289 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0954e7-1af0-4363-9fd4-8713146c0bc6","Type":"ContainerStarted","Data":"d7cee575f710d6240b13792cf55e9dca24d976c6d9c352cbce3782b1981bb01d"} Mar 07 07:14:57 crc kubenswrapper[4941]: I0307 07:14:57.316599 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gt9jr" podStartSLOduration=3.316579596 podStartE2EDuration="3.316579596s" podCreationTimestamp="2026-03-07 07:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:14:56.761019681 +0000 UTC m=+1393.713385146" watchObservedRunningTime="2026-03-07 
07:14:57.316579596 +0000 UTC m=+1394.268945061" Mar 07 07:14:57 crc kubenswrapper[4941]: I0307 07:14:57.325934 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n4cfj"] Mar 07 07:14:58 crc kubenswrapper[4941]: I0307 07:14:58.336711 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" event={"ID":"2d028683-343b-490d-9790-202e64e4e721","Type":"ContainerStarted","Data":"163550a6742880ef66e9f9a94958c2627b1e920b68844505b999087c43e31cbf"} Mar 07 07:14:58 crc kubenswrapper[4941]: I0307 07:14:58.337311 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" event={"ID":"2d028683-343b-490d-9790-202e64e4e721","Type":"ContainerStarted","Data":"92914de9a5d5f0b8e9ad416f125ec49d2055a4f6cd61f557001b37c6e882485b"} Mar 07 07:14:58 crc kubenswrapper[4941]: I0307 07:14:58.352064 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" event={"ID":"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d","Type":"ContainerStarted","Data":"b2176482b6780188feae01350fd7d7388cc5f70b264e779954fd672db582c776"} Mar 07 07:14:58 crc kubenswrapper[4941]: I0307 07:14:58.352117 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:14:58 crc kubenswrapper[4941]: I0307 07:14:58.372526 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" podStartSLOduration=3.372509063 podStartE2EDuration="3.372509063s" podCreationTimestamp="2026-03-07 07:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:14:58.359065394 +0000 UTC m=+1395.311430859" watchObservedRunningTime="2026-03-07 07:14:58.372509063 +0000 UTC m=+1395.324874528" Mar 07 07:14:58 crc kubenswrapper[4941]: I0307 07:14:58.384222 4941 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" podStartSLOduration=4.384198418 podStartE2EDuration="4.384198418s" podCreationTimestamp="2026-03-07 07:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:14:58.377103379 +0000 UTC m=+1395.329468864" watchObservedRunningTime="2026-03-07 07:14:58.384198418 +0000 UTC m=+1395.336563883" Mar 07 07:14:59 crc kubenswrapper[4941]: I0307 07:14:59.246419 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:14:59 crc kubenswrapper[4941]: I0307 07:14:59.259486 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.150506 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn"] Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.152333 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.160370 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn"] Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.162963 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.163591 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.237039 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-secret-volume\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.237081 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-config-volume\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.237121 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlcd\" (UniqueName: \"kubernetes.io/projected/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-kube-api-access-6qlcd\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.338299 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-secret-volume\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.338549 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-config-volume\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.338645 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlcd\" (UniqueName: \"kubernetes.io/projected/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-kube-api-access-6qlcd\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.339986 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-config-volume\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.343326 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-secret-volume\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.354669 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlcd\" (UniqueName: \"kubernetes.io/projected/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-kube-api-access-6qlcd\") pod \"collect-profiles-29547795-c47qn\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.376692 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6712cc3f-2e68-4522-8624-84903da6d19d","Type":"ContainerStarted","Data":"9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e"} Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.376765 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6712cc3f-2e68-4522-8624-84903da6d19d","Type":"ContainerStarted","Data":"5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080"} Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.376805 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" containerName="nova-metadata-log" containerID="cri-o://5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080" gracePeriod=30 Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.376935 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" containerName="nova-metadata-metadata" containerID="cri-o://9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e" gracePeriod=30 Mar 07 07:15:00 
crc kubenswrapper[4941]: I0307 07:15:00.380608 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"74e3be4e-1078-4655-bd5f-f5a7c6550256","Type":"ContainerStarted","Data":"6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe"} Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.383731 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a6edec94-977f-493a-97b6-f90e02d07467","Type":"ContainerStarted","Data":"4538e4358b9525e75783e8afd80be20d7156046fb83eecb859f5b485afa2078b"} Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.383866 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a6edec94-977f-493a-97b6-f90e02d07467" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4538e4358b9525e75783e8afd80be20d7156046fb83eecb859f5b485afa2078b" gracePeriod=30 Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.393900 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0954e7-1af0-4363-9fd4-8713146c0bc6","Type":"ContainerStarted","Data":"a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3"} Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.393953 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0954e7-1af0-4363-9fd4-8713146c0bc6","Type":"ContainerStarted","Data":"95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe"} Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.398255 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.21663215 podStartE2EDuration="6.39815242s" podCreationTimestamp="2026-03-07 07:14:54 +0000 UTC" firstStartedPulling="2026-03-07 07:14:55.539969997 +0000 UTC m=+1392.492335462" lastFinishedPulling="2026-03-07 07:14:59.721490267 +0000 UTC 
m=+1396.673855732" observedRunningTime="2026-03-07 07:15:00.394864886 +0000 UTC m=+1397.347230361" watchObservedRunningTime="2026-03-07 07:15:00.39815242 +0000 UTC m=+1397.350517885" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.455765 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.549010789 podStartE2EDuration="6.455745013s" podCreationTimestamp="2026-03-07 07:14:54 +0000 UTC" firstStartedPulling="2026-03-07 07:14:55.808529986 +0000 UTC m=+1392.760895441" lastFinishedPulling="2026-03-07 07:14:59.7152642 +0000 UTC m=+1396.667629665" observedRunningTime="2026-03-07 07:15:00.43383789 +0000 UTC m=+1397.386203355" watchObservedRunningTime="2026-03-07 07:15:00.455745013 +0000 UTC m=+1397.408110478" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.457017 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.535198861 podStartE2EDuration="6.457011555s" podCreationTimestamp="2026-03-07 07:14:54 +0000 UTC" firstStartedPulling="2026-03-07 07:14:55.794442771 +0000 UTC m=+1392.746808236" lastFinishedPulling="2026-03-07 07:14:59.716255435 +0000 UTC m=+1396.668620930" observedRunningTime="2026-03-07 07:15:00.453657231 +0000 UTC m=+1397.406022696" watchObservedRunningTime="2026-03-07 07:15:00.457011555 +0000 UTC m=+1397.409377020" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.472261 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.74232489 podStartE2EDuration="6.4722429s" podCreationTimestamp="2026-03-07 07:14:54 +0000 UTC" firstStartedPulling="2026-03-07 07:14:56.043534779 +0000 UTC m=+1392.995900244" lastFinishedPulling="2026-03-07 07:14:59.773452779 +0000 UTC m=+1396.725818254" observedRunningTime="2026-03-07 07:15:00.47184353 +0000 UTC m=+1397.424209015" watchObservedRunningTime="2026-03-07 07:15:00.4722429 +0000 UTC 
m=+1397.424608365" Mar 07 07:15:00 crc kubenswrapper[4941]: I0307 07:15:00.480766 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:01 crc kubenswrapper[4941]: I0307 07:15:01.012655 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn"] Mar 07 07:15:01 crc kubenswrapper[4941]: I0307 07:15:01.405775 4941 generic.go:334] "Generic (PLEG): container finished" podID="6712cc3f-2e68-4522-8624-84903da6d19d" containerID="5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080" exitCode=143 Mar 07 07:15:01 crc kubenswrapper[4941]: I0307 07:15:01.405856 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6712cc3f-2e68-4522-8624-84903da6d19d","Type":"ContainerDied","Data":"5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080"} Mar 07 07:15:01 crc kubenswrapper[4941]: I0307 07:15:01.408800 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" event={"ID":"e7b0380d-e6d2-473f-a49b-bdccb4747ccc","Type":"ContainerStarted","Data":"65dd08c7d05870e8d4e5cd7a8abbbafcd882e2d57e5822eebbee49d373a690f3"} Mar 07 07:15:01 crc kubenswrapper[4941]: I0307 07:15:01.408864 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" event={"ID":"e7b0380d-e6d2-473f-a49b-bdccb4747ccc","Type":"ContainerStarted","Data":"a374c3b52e8f68c8121633839a350c75bf1664b2d644d09c6d221903315a9d1d"} Mar 07 07:15:02 crc kubenswrapper[4941]: I0307 07:15:02.418311 4941 generic.go:334] "Generic (PLEG): container finished" podID="e7b0380d-e6d2-473f-a49b-bdccb4747ccc" containerID="65dd08c7d05870e8d4e5cd7a8abbbafcd882e2d57e5822eebbee49d373a690f3" exitCode=0 Mar 07 07:15:02 crc kubenswrapper[4941]: I0307 07:15:02.418357 4941 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" event={"ID":"e7b0380d-e6d2-473f-a49b-bdccb4747ccc","Type":"ContainerDied","Data":"65dd08c7d05870e8d4e5cd7a8abbbafcd882e2d57e5822eebbee49d373a690f3"} Mar 07 07:15:03 crc kubenswrapper[4941]: I0307 07:15:03.934113 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.015729 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-config-volume\") pod \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.015976 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qlcd\" (UniqueName: \"kubernetes.io/projected/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-kube-api-access-6qlcd\") pod \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.016170 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-secret-volume\") pod \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\" (UID: \"e7b0380d-e6d2-473f-a49b-bdccb4747ccc\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.017154 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7b0380d-e6d2-473f-a49b-bdccb4747ccc" (UID: "e7b0380d-e6d2-473f-a49b-bdccb4747ccc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.023450 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-kube-api-access-6qlcd" (OuterVolumeSpecName: "kube-api-access-6qlcd") pod "e7b0380d-e6d2-473f-a49b-bdccb4747ccc" (UID: "e7b0380d-e6d2-473f-a49b-bdccb4747ccc"). InnerVolumeSpecName "kube-api-access-6qlcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.023499 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7b0380d-e6d2-473f-a49b-bdccb4747ccc" (UID: "e7b0380d-e6d2-473f-a49b-bdccb4747ccc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.121284 4941 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.121372 4941 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.121391 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qlcd\" (UniqueName: \"kubernetes.io/projected/e7b0380d-e6d2-473f-a49b-bdccb4747ccc-kube-api-access-6qlcd\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: W0307 07:15:04.364384 4941 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fbfdfb_e78c_4f2c_bfc5_5b8e6597637d.slice/crio-conmon-782b4cb6800a49583e5d4febde0e2f055e88966b8bd69833d2b9828e747c886f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fbfdfb_e78c_4f2c_bfc5_5b8e6597637d.slice/crio-conmon-782b4cb6800a49583e5d4febde0e2f055e88966b8bd69833d2b9828e747c886f.scope: no such file or directory Mar 07 07:15:04 crc kubenswrapper[4941]: W0307 07:15:04.364505 4941 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fbfdfb_e78c_4f2c_bfc5_5b8e6597637d.slice/crio-782b4cb6800a49583e5d4febde0e2f055e88966b8bd69833d2b9828e747c886f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fbfdfb_e78c_4f2c_bfc5_5b8e6597637d.slice/crio-782b4cb6800a49583e5d4febde0e2f055e88966b8bd69833d2b9828e747c886f.scope: no such file or directory Mar 07 07:15:04 crc kubenswrapper[4941]: W0307 07:15:04.372074 4941 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6712cc3f_2e68_4522_8624_84903da6d19d.slice/crio-conmon-5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6712cc3f_2e68_4522_8624_84903da6d19d.slice/crio-conmon-5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080.scope: no such file or directory Mar 07 07:15:04 crc kubenswrapper[4941]: W0307 07:15:04.374703 4941 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6712cc3f_2e68_4522_8624_84903da6d19d.slice/crio-5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6712cc3f_2e68_4522_8624_84903da6d19d.slice/crio-5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080.scope: no such file or directory Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.446531 4941 generic.go:334] "Generic (PLEG): container finished" podID="672d89d2-46b4-449f-ad71-2716d50eb2fe" containerID="c31df48ec45a74d7943bce634439f98715ea8900b473d4e84dcbc0cbfcb889ec" exitCode=0 Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.446620 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gt9jr" event={"ID":"672d89d2-46b4-449f-ad71-2716d50eb2fe","Type":"ContainerDied","Data":"c31df48ec45a74d7943bce634439f98715ea8900b473d4e84dcbc0cbfcb889ec"} Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.451501 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" event={"ID":"e7b0380d-e6d2-473f-a49b-bdccb4747ccc","Type":"ContainerDied","Data":"a374c3b52e8f68c8121633839a350c75bf1664b2d644d09c6d221903315a9d1d"} Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.451544 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a374c3b52e8f68c8121633839a350c75bf1664b2d644d09c6d221903315a9d1d" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.451507 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.475194 4941 generic.go:334] "Generic (PLEG): container finished" podID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerID="c58e5b9c34ddbe30187350ca2343ceb9e751aeabfefecaacc487d2225fa6920e" exitCode=137 Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.475245 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerDied","Data":"c58e5b9c34ddbe30187350ca2343ceb9e751aeabfefecaacc487d2225fa6920e"} Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.656864 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.704098 4941 scope.go:117] "RemoveContainer" containerID="bb79d1d1306c286d1e351ed47763214d34ab77347dd00544da397bc6226f5eca" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.734554 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-combined-ca-bundle\") pod \"2ab97627-6b7f-4984-af5f-a732f13b2486\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.734663 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn65p\" (UniqueName: \"kubernetes.io/projected/2ab97627-6b7f-4984-af5f-a732f13b2486-kube-api-access-kn65p\") pod \"2ab97627-6b7f-4984-af5f-a732f13b2486\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.734726 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-scripts\") pod 
\"2ab97627-6b7f-4984-af5f-a732f13b2486\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.734763 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-config-data\") pod \"2ab97627-6b7f-4984-af5f-a732f13b2486\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.734813 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-run-httpd\") pod \"2ab97627-6b7f-4984-af5f-a732f13b2486\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.734877 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-log-httpd\") pod \"2ab97627-6b7f-4984-af5f-a732f13b2486\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.734959 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-sg-core-conf-yaml\") pod \"2ab97627-6b7f-4984-af5f-a732f13b2486\" (UID: \"2ab97627-6b7f-4984-af5f-a732f13b2486\") " Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.735791 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ab97627-6b7f-4984-af5f-a732f13b2486" (UID: "2ab97627-6b7f-4984-af5f-a732f13b2486"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.735971 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.735975 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ab97627-6b7f-4984-af5f-a732f13b2486" (UID: "2ab97627-6b7f-4984-af5f-a732f13b2486"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.738278 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab97627-6b7f-4984-af5f-a732f13b2486-kube-api-access-kn65p" (OuterVolumeSpecName: "kube-api-access-kn65p") pod "2ab97627-6b7f-4984-af5f-a732f13b2486" (UID: "2ab97627-6b7f-4984-af5f-a732f13b2486"). InnerVolumeSpecName "kube-api-access-kn65p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.738722 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-scripts" (OuterVolumeSpecName: "scripts") pod "2ab97627-6b7f-4984-af5f-a732f13b2486" (UID: "2ab97627-6b7f-4984-af5f-a732f13b2486"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.765391 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ab97627-6b7f-4984-af5f-a732f13b2486" (UID: "2ab97627-6b7f-4984-af5f-a732f13b2486"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.812571 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ab97627-6b7f-4984-af5f-a732f13b2486" (UID: "2ab97627-6b7f-4984-af5f-a732f13b2486"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.833476 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-config-data" (OuterVolumeSpecName: "config-data") pod "2ab97627-6b7f-4984-af5f-a732f13b2486" (UID: "2ab97627-6b7f-4984-af5f-a732f13b2486"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.837587 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.837618 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.837627 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn65p\" (UniqueName: \"kubernetes.io/projected/2ab97627-6b7f-4984-af5f-a732f13b2486-kube-api-access-kn65p\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.837640 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.837649 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab97627-6b7f-4984-af5f-a732f13b2486-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.837658 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ab97627-6b7f-4984-af5f-a732f13b2486-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.915134 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 07:15:04 crc kubenswrapper[4941]: I0307 07:15:04.915178 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.092058 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.092113 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.123652 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.123702 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.161561 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.266571 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:15:05 crc 
kubenswrapper[4941]: I0307 07:15:05.272147 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.335188 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-kxnb4"] Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.335465 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" podUID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" containerName="dnsmasq-dns" containerID="cri-o://32c5917b694cba85411826ec18fe5346f9c4d4e9441c2f979e5480f971e8e488" gracePeriod=10 Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.493721 4941 generic.go:334] "Generic (PLEG): container finished" podID="2d028683-343b-490d-9790-202e64e4e721" containerID="163550a6742880ef66e9f9a94958c2627b1e920b68844505b999087c43e31cbf" exitCode=0 Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.493815 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" event={"ID":"2d028683-343b-490d-9790-202e64e4e721","Type":"ContainerDied","Data":"163550a6742880ef66e9f9a94958c2627b1e920b68844505b999087c43e31cbf"} Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.510193 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ab97627-6b7f-4984-af5f-a732f13b2486","Type":"ContainerDied","Data":"accece7c95c66a08119647336a9c3a2bf1fa2a3706527a1b2e0a8b33c272492d"} Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.510222 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.510252 4941 scope.go:117] "RemoveContainer" containerID="c58e5b9c34ddbe30187350ca2343ceb9e751aeabfefecaacc487d2225fa6920e" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.514432 4941 generic.go:334] "Generic (PLEG): container finished" podID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" containerID="32c5917b694cba85411826ec18fe5346f9c4d4e9441c2f979e5480f971e8e488" exitCode=0 Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.514673 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" event={"ID":"ce3f0f8f-0b63-4775-a794-adb5f51cfe66","Type":"ContainerDied","Data":"32c5917b694cba85411826ec18fe5346f9c4d4e9441c2f979e5480f971e8e488"} Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.546973 4941 scope.go:117] "RemoveContainer" containerID="f29ef3767e472ba307a6f90a5ede0e8a49f38ed37e729ca8db1ef166df3f9e5c" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.550237 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.568612 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581046 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:05 crc kubenswrapper[4941]: E0307 07:15:05.581527 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="ceilometer-notification-agent" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581546 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="ceilometer-notification-agent" Mar 07 07:15:05 crc kubenswrapper[4941]: E0307 07:15:05.581561 4941 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="proxy-httpd" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581569 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="proxy-httpd" Mar 07 07:15:05 crc kubenswrapper[4941]: E0307 07:15:05.581582 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="sg-core" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581589 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="sg-core" Mar 07 07:15:05 crc kubenswrapper[4941]: E0307 07:15:05.581610 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b0380d-e6d2-473f-a49b-bdccb4747ccc" containerName="collect-profiles" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581617 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b0380d-e6d2-473f-a49b-bdccb4747ccc" containerName="collect-profiles" Mar 07 07:15:05 crc kubenswrapper[4941]: E0307 07:15:05.581653 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="ceilometer-central-agent" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581661 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="ceilometer-central-agent" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581872 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="ceilometer-central-agent" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581919 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="sg-core" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581937 4941 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="ceilometer-notification-agent" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581955 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" containerName="proxy-httpd" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.581972 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b0380d-e6d2-473f-a49b-bdccb4747ccc" containerName="collect-profiles" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.592592 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.599939 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.602505 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.622023 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.628975 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.633134 4941 scope.go:117] "RemoveContainer" containerID="4644b7abc8347fbcc44accca2ab2f75d488b41012a2fc743e0f3d6b3860cf462" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.659671 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-config-data\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.659755 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-scripts\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.660282 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.660443 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-run-httpd\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.660504 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-log-httpd\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.660546 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsgwh\" (UniqueName: \"kubernetes.io/projected/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-kube-api-access-xsgwh\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.660607 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.670637 4941 scope.go:117] "RemoveContainer" containerID="ea3be111ebf6ccdd8344a06aaac53960cb585205448af6727079dd15cb825d66" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.766477 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-log-httpd\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.766549 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsgwh\" (UniqueName: \"kubernetes.io/projected/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-kube-api-access-xsgwh\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.766611 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.766673 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-config-data\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.766737 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-scripts\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.766943 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-log-httpd\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.767619 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.767726 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-run-httpd\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.768107 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-run-httpd\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.775632 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-config-data\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.788009 4941 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.790914 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsgwh\" (UniqueName: \"kubernetes.io/projected/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-kube-api-access-xsgwh\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.792230 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.806447 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-scripts\") pod \"ceilometer-0\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.944468 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:05 crc kubenswrapper[4941]: I0307 07:15:05.979814 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab97627-6b7f-4984-af5f-a732f13b2486" path="/var/lib/kubelet/pods/2ab97627-6b7f-4984-af5f-a732f13b2486/volumes" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.008789 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.021852 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.076739 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-swift-storage-0\") pod \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.076897 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-config\") pod \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.076930 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-sb\") pod \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.076998 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfhjb\" (UniqueName: \"kubernetes.io/projected/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-kube-api-access-hfhjb\") pod \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.077020 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-nb\") pod \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\" (UID: 
\"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.077079 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-svc\") pod \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.086340 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-kube-api-access-hfhjb" (OuterVolumeSpecName: "kube-api-access-hfhjb") pod "ce3f0f8f-0b63-4775-a794-adb5f51cfe66" (UID: "ce3f0f8f-0b63-4775-a794-adb5f51cfe66"). InnerVolumeSpecName "kube-api-access-hfhjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.137741 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce3f0f8f-0b63-4775-a794-adb5f51cfe66" (UID: "ce3f0f8f-0b63-4775-a794-adb5f51cfe66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.150372 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-config" (OuterVolumeSpecName: "config") pod "ce3f0f8f-0b63-4775-a794-adb5f51cfe66" (UID: "ce3f0f8f-0b63-4775-a794-adb5f51cfe66"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.176999 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce3f0f8f-0b63-4775-a794-adb5f51cfe66" (UID: "ce3f0f8f-0b63-4775-a794-adb5f51cfe66"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.177656 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.177767 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.178379 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce3f0f8f-0b63-4775-a794-adb5f51cfe66" (UID: "ce3f0f8f-0b63-4775-a794-adb5f51cfe66"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.178607 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-svc\") pod \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\" (UID: \"ce3f0f8f-0b63-4775-a794-adb5f51cfe66\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.178696 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-scripts\") pod \"672d89d2-46b4-449f-ad71-2716d50eb2fe\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.178729 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xcvx\" (UniqueName: \"kubernetes.io/projected/672d89d2-46b4-449f-ad71-2716d50eb2fe-kube-api-access-7xcvx\") pod \"672d89d2-46b4-449f-ad71-2716d50eb2fe\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.178799 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-combined-ca-bundle\") pod \"672d89d2-46b4-449f-ad71-2716d50eb2fe\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.178823 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-config-data\") pod \"672d89d2-46b4-449f-ad71-2716d50eb2fe\" (UID: \"672d89d2-46b4-449f-ad71-2716d50eb2fe\") " Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.179231 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.179246 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.179256 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.179264 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfhjb\" (UniqueName: \"kubernetes.io/projected/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-kube-api-access-hfhjb\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: W0307 07:15:06.179630 4941 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ce3f0f8f-0b63-4775-a794-adb5f51cfe66/volumes/kubernetes.io~configmap/dns-svc Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.179642 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce3f0f8f-0b63-4775-a794-adb5f51cfe66" (UID: "ce3f0f8f-0b63-4775-a794-adb5f51cfe66"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.182597 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce3f0f8f-0b63-4775-a794-adb5f51cfe66" (UID: "ce3f0f8f-0b63-4775-a794-adb5f51cfe66"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.183604 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-scripts" (OuterVolumeSpecName: "scripts") pod "672d89d2-46b4-449f-ad71-2716d50eb2fe" (UID: "672d89d2-46b4-449f-ad71-2716d50eb2fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.184883 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/672d89d2-46b4-449f-ad71-2716d50eb2fe-kube-api-access-7xcvx" (OuterVolumeSpecName: "kube-api-access-7xcvx") pod "672d89d2-46b4-449f-ad71-2716d50eb2fe" (UID: "672d89d2-46b4-449f-ad71-2716d50eb2fe"). InnerVolumeSpecName "kube-api-access-7xcvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.202158 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-config-data" (OuterVolumeSpecName: "config-data") pod "672d89d2-46b4-449f-ad71-2716d50eb2fe" (UID: "672d89d2-46b4-449f-ad71-2716d50eb2fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.217526 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "672d89d2-46b4-449f-ad71-2716d50eb2fe" (UID: "672d89d2-46b4-449f-ad71-2716d50eb2fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.280808 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.280839 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce3f0f8f-0b63-4775-a794-adb5f51cfe66-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.280848 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.280858 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xcvx\" (UniqueName: \"kubernetes.io/projected/672d89d2-46b4-449f-ad71-2716d50eb2fe-kube-api-access-7xcvx\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.280867 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.280877 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672d89d2-46b4-449f-ad71-2716d50eb2fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.470299 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.525162 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" 
event={"ID":"ce3f0f8f-0b63-4775-a794-adb5f51cfe66","Type":"ContainerDied","Data":"521b29dcdf0aae20d2f26d535950e34532f6eb437ccb32433c66e3fac28df7ac"} Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.525185 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbf7756bf-kxnb4" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.525218 4941 scope.go:117] "RemoveContainer" containerID="32c5917b694cba85411826ec18fe5346f9c4d4e9441c2f979e5480f971e8e488" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.527235 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gt9jr" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.527205 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gt9jr" event={"ID":"672d89d2-46b4-449f-ad71-2716d50eb2fe","Type":"ContainerDied","Data":"4f5058965f354e843b1ebd82108ea85c1f068002cfa9626bf3e9a03e5cf5d42e"} Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.527635 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f5058965f354e843b1ebd82108ea85c1f068002cfa9626bf3e9a03e5cf5d42e" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.528735 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerStarted","Data":"15b2f9112e2570586f2813aa1e335fef526cc88403c0d98849f31049fa12b4e8"} Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.561463 4941 scope.go:117] "RemoveContainer" containerID="63f5df511d133f8dafff8a42eb1c5eae7f26a1929b39aad4df638313ee7e7378" Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.580550 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-kxnb4"] Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.581635 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5cbf7756bf-kxnb4"] Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.614998 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.615263 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-log" containerID="cri-o://95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe" gracePeriod=30 Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.615469 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-api" containerID="cri-o://a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3" gracePeriod=30 Mar 07 07:15:06 crc kubenswrapper[4941]: I0307 07:15:06.627729 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.030087 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.195596 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-config-data\") pod \"2d028683-343b-490d-9790-202e64e4e721\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.195935 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drv9j\" (UniqueName: \"kubernetes.io/projected/2d028683-343b-490d-9790-202e64e4e721-kube-api-access-drv9j\") pod \"2d028683-343b-490d-9790-202e64e4e721\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.195970 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-combined-ca-bundle\") pod \"2d028683-343b-490d-9790-202e64e4e721\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.196000 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-scripts\") pod \"2d028683-343b-490d-9790-202e64e4e721\" (UID: \"2d028683-343b-490d-9790-202e64e4e721\") " Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.199539 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-scripts" (OuterVolumeSpecName: "scripts") pod "2d028683-343b-490d-9790-202e64e4e721" (UID: "2d028683-343b-490d-9790-202e64e4e721"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.200020 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d028683-343b-490d-9790-202e64e4e721-kube-api-access-drv9j" (OuterVolumeSpecName: "kube-api-access-drv9j") pod "2d028683-343b-490d-9790-202e64e4e721" (UID: "2d028683-343b-490d-9790-202e64e4e721"). InnerVolumeSpecName "kube-api-access-drv9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.228219 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d028683-343b-490d-9790-202e64e4e721" (UID: "2d028683-343b-490d-9790-202e64e4e721"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.230863 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-config-data" (OuterVolumeSpecName: "config-data") pod "2d028683-343b-490d-9790-202e64e4e721" (UID: "2d028683-343b-490d-9790-202e64e4e721"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.298800 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.298840 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drv9j\" (UniqueName: \"kubernetes.io/projected/2d028683-343b-490d-9790-202e64e4e721-kube-api-access-drv9j\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.298854 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.298866 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d028683-343b-490d-9790-202e64e4e721-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.567038 4941 generic.go:334] "Generic (PLEG): container finished" podID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerID="95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe" exitCode=143 Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.567105 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0954e7-1af0-4363-9fd4-8713146c0bc6","Type":"ContainerDied","Data":"95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe"} Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.572135 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" event={"ID":"2d028683-343b-490d-9790-202e64e4e721","Type":"ContainerDied","Data":"92914de9a5d5f0b8e9ad416f125ec49d2055a4f6cd61f557001b37c6e882485b"} Mar 07 
07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.572175 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92914de9a5d5f0b8e9ad416f125ec49d2055a4f6cd61f557001b37c6e882485b" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.572240 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n4cfj" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.577248 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerStarted","Data":"e95d284d1897da3150166394fed85abe100a84be0cf4a78984cff4e5dda880f9"} Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.577369 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="74e3be4e-1078-4655-bd5f-f5a7c6550256" containerName="nova-scheduler-scheduler" containerID="cri-o://6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe" gracePeriod=30 Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.588128 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 07 07:15:07 crc kubenswrapper[4941]: E0307 07:15:07.588635 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672d89d2-46b4-449f-ad71-2716d50eb2fe" containerName="nova-manage" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.588658 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="672d89d2-46b4-449f-ad71-2716d50eb2fe" containerName="nova-manage" Mar 07 07:15:07 crc kubenswrapper[4941]: E0307 07:15:07.588677 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" containerName="dnsmasq-dns" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.588686 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" containerName="dnsmasq-dns" 
Mar 07 07:15:07 crc kubenswrapper[4941]: E0307 07:15:07.588714 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d028683-343b-490d-9790-202e64e4e721" containerName="nova-cell1-conductor-db-sync" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.588723 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d028683-343b-490d-9790-202e64e4e721" containerName="nova-cell1-conductor-db-sync" Mar 07 07:15:07 crc kubenswrapper[4941]: E0307 07:15:07.588750 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" containerName="init" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.588759 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" containerName="init" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.589102 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" containerName="dnsmasq-dns" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.589133 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="672d89d2-46b4-449f-ad71-2716d50eb2fe" containerName="nova-manage" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.589147 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d028683-343b-490d-9790-202e64e4e721" containerName="nova-cell1-conductor-db-sync" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.590118 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.595538 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.606619 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.708705 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.708758 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtbwv\" (UniqueName: \"kubernetes.io/projected/943d63f3-758d-4884-8086-93defd44f58a-kube-api-access-rtbwv\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.708809 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.810682 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc 
kubenswrapper[4941]: I0307 07:15:07.810780 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtbwv\" (UniqueName: \"kubernetes.io/projected/943d63f3-758d-4884-8086-93defd44f58a-kube-api-access-rtbwv\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.810878 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.815344 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.815489 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.827478 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtbwv\" (UniqueName: \"kubernetes.io/projected/943d63f3-758d-4884-8086-93defd44f58a-kube-api-access-rtbwv\") pod \"nova-cell1-conductor-0\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.922890 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:07 crc kubenswrapper[4941]: I0307 07:15:07.968061 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3f0f8f-0b63-4775-a794-adb5f51cfe66" path="/var/lib/kubelet/pods/ce3f0f8f-0b63-4775-a794-adb5f51cfe66/volumes" Mar 07 07:15:08 crc kubenswrapper[4941]: I0307 07:15:08.400921 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 07 07:15:08 crc kubenswrapper[4941]: I0307 07:15:08.589314 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"943d63f3-758d-4884-8086-93defd44f58a","Type":"ContainerStarted","Data":"272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97"} Mar 07 07:15:08 crc kubenswrapper[4941]: I0307 07:15:08.589383 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"943d63f3-758d-4884-8086-93defd44f58a","Type":"ContainerStarted","Data":"4695c817ce463aa48ad28a254de7a04f00fe7c0d74c1bf91c8e5ef6ead291855"} Mar 07 07:15:08 crc kubenswrapper[4941]: I0307 07:15:08.589450 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 07 07:15:08 crc kubenswrapper[4941]: I0307 07:15:08.592213 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerStarted","Data":"1129d89f81b40b4d80c583a41cfcad8c875d8ecf9fd33d49b88c5c50f4cb4147"} Mar 07 07:15:08 crc kubenswrapper[4941]: I0307 07:15:08.592259 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerStarted","Data":"11566e65a4e20367dbc4e401b2b98d6fce48d8fcaf9c840c54728f3b24d8e648"} Mar 07 07:15:08 crc kubenswrapper[4941]: I0307 07:15:08.618271 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.618249794 podStartE2EDuration="1.618249794s" podCreationTimestamp="2026-03-07 07:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:08.614790217 +0000 UTC m=+1405.567155682" watchObservedRunningTime="2026-03-07 07:15:08.618249794 +0000 UTC m=+1405.570615259" Mar 07 07:15:10 crc kubenswrapper[4941]: E0307 07:15:10.126045 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:15:10 crc kubenswrapper[4941]: E0307 07:15:10.129787 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:15:10 crc kubenswrapper[4941]: E0307 07:15:10.132165 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:15:10 crc kubenswrapper[4941]: E0307 07:15:10.132205 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="74e3be4e-1078-4655-bd5f-f5a7c6550256" containerName="nova-scheduler-scheduler" Mar 07 07:15:10 
crc kubenswrapper[4941]: I0307 07:15:10.313658 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.313727 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.313784 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.314517 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"275b7664a9752e3935f384cd42fa92a626ebe7af03267d645869fc5e152276f5"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.314571 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://275b7664a9752e3935f384cd42fa92a626ebe7af03267d645869fc5e152276f5" gracePeriod=600 Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.618562 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" 
containerID="275b7664a9752e3935f384cd42fa92a626ebe7af03267d645869fc5e152276f5" exitCode=0 Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.618653 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"275b7664a9752e3935f384cd42fa92a626ebe7af03267d645869fc5e152276f5"} Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.618708 4941 scope.go:117] "RemoveContainer" containerID="8b286093b0ba04c8db409f2f8003244d432459a7a31a64eb7ee6e534880ca523" Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.625779 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerStarted","Data":"1e6a51c773f67d6c421a2aa2916d5fef7dc6722f7f17013720f33cfe20edd934"} Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.626056 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 07 07:15:10 crc kubenswrapper[4941]: I0307 07:15:10.653180 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.332098355 podStartE2EDuration="5.653154074s" podCreationTimestamp="2026-03-07 07:15:05 +0000 UTC" firstStartedPulling="2026-03-07 07:15:06.471633204 +0000 UTC m=+1403.423998669" lastFinishedPulling="2026-03-07 07:15:09.792688903 +0000 UTC m=+1406.745054388" observedRunningTime="2026-03-07 07:15:10.652001555 +0000 UTC m=+1407.604367020" watchObservedRunningTime="2026-03-07 07:15:10.653154074 +0000 UTC m=+1407.605519549" Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.537385 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.592639 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-combined-ca-bundle\") pod \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") "
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.592739 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0954e7-1af0-4363-9fd4-8713146c0bc6-logs\") pod \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") "
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.592826 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hct2z\" (UniqueName: \"kubernetes.io/projected/8d0954e7-1af0-4363-9fd4-8713146c0bc6-kube-api-access-hct2z\") pod \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") "
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.592905 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-config-data\") pod \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\" (UID: \"8d0954e7-1af0-4363-9fd4-8713146c0bc6\") "
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.593690 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d0954e7-1af0-4363-9fd4-8713146c0bc6-logs" (OuterVolumeSpecName: "logs") pod "8d0954e7-1af0-4363-9fd4-8713146c0bc6" (UID: "8d0954e7-1af0-4363-9fd4-8713146c0bc6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.612692 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0954e7-1af0-4363-9fd4-8713146c0bc6-kube-api-access-hct2z" (OuterVolumeSpecName: "kube-api-access-hct2z") pod "8d0954e7-1af0-4363-9fd4-8713146c0bc6" (UID: "8d0954e7-1af0-4363-9fd4-8713146c0bc6"). InnerVolumeSpecName "kube-api-access-hct2z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.623136 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d0954e7-1af0-4363-9fd4-8713146c0bc6" (UID: "8d0954e7-1af0-4363-9fd4-8713146c0bc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.644065 4941 generic.go:334] "Generic (PLEG): container finished" podID="74e3be4e-1078-4655-bd5f-f5a7c6550256" containerID="6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe" exitCode=0
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.644130 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"74e3be4e-1078-4655-bd5f-f5a7c6550256","Type":"ContainerDied","Data":"6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe"}
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.648424 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97"}
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.650607 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-config-data" (OuterVolumeSpecName: "config-data") pod "8d0954e7-1af0-4363-9fd4-8713146c0bc6" (UID: "8d0954e7-1af0-4363-9fd4-8713146c0bc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.653120 4941 generic.go:334] "Generic (PLEG): container finished" podID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerID="a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3" exitCode=0
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.653799 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.654590 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0954e7-1af0-4363-9fd4-8713146c0bc6","Type":"ContainerDied","Data":"a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3"}
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.654676 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d0954e7-1af0-4363-9fd4-8713146c0bc6","Type":"ContainerDied","Data":"d7cee575f710d6240b13792cf55e9dca24d976c6d9c352cbce3782b1981bb01d"}
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.654701 4941 scope.go:117] "RemoveContainer" containerID="a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.695794 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hct2z\" (UniqueName: \"kubernetes.io/projected/8d0954e7-1af0-4363-9fd4-8713146c0bc6-kube-api-access-hct2z\") on node \"crc\" DevicePath \"\""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.695821 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.695841 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d0954e7-1af0-4363-9fd4-8713146c0bc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.695850 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d0954e7-1af0-4363-9fd4-8713146c0bc6-logs\") on node \"crc\" DevicePath \"\""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.735662 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.744492 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.754268 4941 scope.go:117] "RemoveContainer" containerID="95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.802598 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 07 07:15:11 crc kubenswrapper[4941]: E0307 07:15:11.803434 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-api"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.803463 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-api"
Mar 07 07:15:11 crc kubenswrapper[4941]: E0307 07:15:11.803485 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-log"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.803494 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-log"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.803933 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-api"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.803990 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" containerName="nova-api-log"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.805784 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.813148 4941 scope.go:117] "RemoveContainer" containerID="a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.814248 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 07 07:15:11 crc kubenswrapper[4941]: E0307 07:15:11.814891 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3\": container with ID starting with a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3 not found: ID does not exist" containerID="a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.814995 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3"} err="failed to get container status \"a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3\": rpc error: code = NotFound desc = could not find container \"a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3\": container with ID starting with a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3 not found: ID does not exist"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.815045 4941 scope.go:117] "RemoveContainer" containerID="95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe"
Mar 07 07:15:11 crc kubenswrapper[4941]: E0307 07:15:11.817142 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe\": container with ID starting with 95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe not found: ID does not exist" containerID="95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.817203 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe"} err="failed to get container status \"95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe\": rpc error: code = NotFound desc = could not find container \"95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe\": container with ID starting with 95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe not found: ID does not exist"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.821036 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.838934 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.904092 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-combined-ca-bundle\") pod \"74e3be4e-1078-4655-bd5f-f5a7c6550256\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") "
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.904414 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-config-data\") pod \"74e3be4e-1078-4655-bd5f-f5a7c6550256\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") "
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.904499 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtq7d\" (UniqueName: \"kubernetes.io/projected/74e3be4e-1078-4655-bd5f-f5a7c6550256-kube-api-access-mtq7d\") pod \"74e3be4e-1078-4655-bd5f-f5a7c6550256\" (UID: \"74e3be4e-1078-4655-bd5f-f5a7c6550256\") "
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.905166 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd8a66f-ad94-48d8-adc6-e71e962db352-logs\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.905266 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsmh6\" (UniqueName: \"kubernetes.io/projected/4cd8a66f-ad94-48d8-adc6-e71e962db352-kube-api-access-lsmh6\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.905317 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-config-data\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.905333 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.907729 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e3be4e-1078-4655-bd5f-f5a7c6550256-kube-api-access-mtq7d" (OuterVolumeSpecName: "kube-api-access-mtq7d") pod "74e3be4e-1078-4655-bd5f-f5a7c6550256" (UID: "74e3be4e-1078-4655-bd5f-f5a7c6550256"). InnerVolumeSpecName "kube-api-access-mtq7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.928582 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74e3be4e-1078-4655-bd5f-f5a7c6550256" (UID: "74e3be4e-1078-4655-bd5f-f5a7c6550256"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.930387 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-config-data" (OuterVolumeSpecName: "config-data") pod "74e3be4e-1078-4655-bd5f-f5a7c6550256" (UID: "74e3be4e-1078-4655-bd5f-f5a7c6550256"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:15:11 crc kubenswrapper[4941]: I0307 07:15:11.966196 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d0954e7-1af0-4363-9fd4-8713146c0bc6" path="/var/lib/kubelet/pods/8d0954e7-1af0-4363-9fd4-8713146c0bc6/volumes"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.007630 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsmh6\" (UniqueName: \"kubernetes.io/projected/4cd8a66f-ad94-48d8-adc6-e71e962db352-kube-api-access-lsmh6\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.007984 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-config-data\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.008016 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.008173 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd8a66f-ad94-48d8-adc6-e71e962db352-logs\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.008265 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-config-data\") on node \"crc\" DevicePath \"\""
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.008288 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtq7d\" (UniqueName: \"kubernetes.io/projected/74e3be4e-1078-4655-bd5f-f5a7c6550256-kube-api-access-mtq7d\") on node \"crc\" DevicePath \"\""
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.008300 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e3be4e-1078-4655-bd5f-f5a7c6550256-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.008947 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd8a66f-ad94-48d8-adc6-e71e962db352-logs\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.015884 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-config-data\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.016854 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.023369 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsmh6\" (UniqueName: \"kubernetes.io/projected/4cd8a66f-ad94-48d8-adc6-e71e962db352-kube-api-access-lsmh6\") pod \"nova-api-0\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.137092 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.602586 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.674099 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd8a66f-ad94-48d8-adc6-e71e962db352","Type":"ContainerStarted","Data":"451fa123a8125c075e8f8ecb1395c3c1ed5b9df4559b31c289a4eb24656f1d44"}
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.687435 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.687422 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"74e3be4e-1078-4655-bd5f-f5a7c6550256","Type":"ContainerDied","Data":"850ff261d488ad60c7f02beae257ccfb3e65161de267906209b49038db815495"}
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.687503 4941 scope.go:117] "RemoveContainer" containerID="6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.754750 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.771843 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.785191 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 07 07:15:12 crc kubenswrapper[4941]: E0307 07:15:12.785704 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e3be4e-1078-4655-bd5f-f5a7c6550256" containerName="nova-scheduler-scheduler"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.785725 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e3be4e-1078-4655-bd5f-f5a7c6550256" containerName="nova-scheduler-scheduler"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.785931 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e3be4e-1078-4655-bd5f-f5a7c6550256" containerName="nova-scheduler-scheduler"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.786572 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.789192 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.815768 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.822956 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-config-data\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.823028 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfcfz\" (UniqueName: \"kubernetes.io/projected/04b46cb5-1e5f-49ac-9852-ebb562330737-kube-api-access-tfcfz\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.823152 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.925241 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-config-data\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.925301 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfcfz\" (UniqueName: \"kubernetes.io/projected/04b46cb5-1e5f-49ac-9852-ebb562330737-kube-api-access-tfcfz\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.925366 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.929888 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-config-data\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.930103 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:12 crc kubenswrapper[4941]: I0307 07:15:12.944297 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfcfz\" (UniqueName: \"kubernetes.io/projected/04b46cb5-1e5f-49ac-9852-ebb562330737-kube-api-access-tfcfz\") pod \"nova-scheduler-0\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " pod="openstack/nova-scheduler-0"
Mar 07 07:15:13 crc kubenswrapper[4941]: I0307 07:15:13.111796 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 07 07:15:13 crc kubenswrapper[4941]: I0307 07:15:13.655990 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 07 07:15:13 crc kubenswrapper[4941]: I0307 07:15:13.702943 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd8a66f-ad94-48d8-adc6-e71e962db352","Type":"ContainerStarted","Data":"c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb"}
Mar 07 07:15:13 crc kubenswrapper[4941]: I0307 07:15:13.703006 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd8a66f-ad94-48d8-adc6-e71e962db352","Type":"ContainerStarted","Data":"c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6"}
Mar 07 07:15:13 crc kubenswrapper[4941]: I0307 07:15:13.705286 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b46cb5-1e5f-49ac-9852-ebb562330737","Type":"ContainerStarted","Data":"018406604e00a3ab34b3d05cf12f923b337ec47ec05a92229861873150941da8"}
Mar 07 07:15:13 crc kubenswrapper[4941]: I0307 07:15:13.728005 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.727990869 podStartE2EDuration="2.727990869s" podCreationTimestamp="2026-03-07 07:15:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:13.718098859 +0000 UTC m=+1410.670464334" watchObservedRunningTime="2026-03-07 07:15:13.727990869 +0000 UTC m=+1410.680356334"
Mar 07 07:15:13 crc kubenswrapper[4941]: I0307 07:15:13.982430 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e3be4e-1078-4655-bd5f-f5a7c6550256" path="/var/lib/kubelet/pods/74e3be4e-1078-4655-bd5f-f5a7c6550256/volumes"
Mar 07 07:15:14 crc kubenswrapper[4941]: I0307 07:15:14.752558 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b46cb5-1e5f-49ac-9852-ebb562330737","Type":"ContainerStarted","Data":"831214eaf89a9fcbe1ed8c8feb08de8789226d181d7192eb68362e3d028092ff"}
Mar 07 07:15:14 crc kubenswrapper[4941]: I0307 07:15:14.775105 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.775080602 podStartE2EDuration="2.775080602s" podCreationTimestamp="2026-03-07 07:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:14.768776742 +0000 UTC m=+1411.721142217" watchObservedRunningTime="2026-03-07 07:15:14.775080602 +0000 UTC m=+1411.727446057"
Mar 07 07:15:17 crc kubenswrapper[4941]: I0307 07:15:17.982536 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 07 07:15:18 crc kubenswrapper[4941]: I0307 07:15:18.112737 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 07 07:15:22 crc kubenswrapper[4941]: I0307 07:15:22.138091 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 07 07:15:22 crc kubenswrapper[4941]: I0307 07:15:22.138915 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 07 07:15:23 crc kubenswrapper[4941]: I0307 07:15:23.112772 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 07 07:15:23 crc kubenswrapper[4941]: I0307 07:15:23.149831 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 07 07:15:23 crc kubenswrapper[4941]: I0307 07:15:23.220595 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 07 07:15:23 crc kubenswrapper[4941]: I0307 07:15:23.220652 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 07 07:15:23 crc kubenswrapper[4941]: I0307 07:15:23.886528 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 07 07:15:30 crc kubenswrapper[4941]: W0307 07:15:30.427135 4941 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74e3be4e_1078_4655_bd5f_f5a7c6550256.slice/crio-6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74e3be4e_1078_4655_bd5f_f5a7c6550256.slice/crio-6f3804b8a6c46d0c9e05448a2f6a500a22186213382946e0cc47cd06b5c37fbe.scope: no such file or directory
Mar 07 07:15:30 crc kubenswrapper[4941]: W0307 07:15:30.427947 4941 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b0380d_e6d2_473f_a49b_bdccb4747ccc.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b0380d_e6d2_473f_a49b_bdccb4747ccc.slice: no such file or directory
Mar 07 07:15:30 crc kubenswrapper[4941]: W0307 07:15:30.427989 4941 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0954e7_1af0_4363_9fd4_8713146c0bc6.slice/crio-conmon-a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0954e7_1af0_4363_9fd4_8713146c0bc6.slice/crio-conmon-a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3.scope: no such file or directory
Mar 07 07:15:30 crc kubenswrapper[4941]: W0307 07:15:30.428019 4941 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6712cc3f_2e68_4522_8624_84903da6d19d.slice/crio-conmon-9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6712cc3f_2e68_4522_8624_84903da6d19d.slice/crio-conmon-9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e.scope: no such file or directory
Mar 07 07:15:30 crc kubenswrapper[4941]: W0307 07:15:30.428062 4941 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0954e7_1af0_4363_9fd4_8713146c0bc6.slice/crio-a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0954e7_1af0_4363_9fd4_8713146c0bc6.slice/crio-a3bfcd8436d86c910e92aa664186fdb131a0ff5fc697fb12796bca3c7ddee6e3.scope: no such file or directory
Mar 07 07:15:30 crc kubenswrapper[4941]: W0307 07:15:30.428100 4941 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6712cc3f_2e68_4522_8624_84903da6d19d.slice/crio-9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6712cc3f_2e68_4522_8624_84903da6d19d.slice/crio-9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e.scope: no such file or directory
Mar 07 07:15:30 crc kubenswrapper[4941]: W0307 07:15:30.532942 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0954e7_1af0_4363_9fd4_8713146c0bc6.slice/crio-95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe.scope WatchSource:0}: Error finding container 95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe: Status 404 returned error can't find the container with id 95b5651120783a369200c1747b7fefd6258863c620adfb5a3891dc9f5c6a73fe
Mar 07 07:15:30 crc kubenswrapper[4941]: E0307 07:15:30.680993 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250d2c0d_993b_466a_a5e0_bacae5fe8df5.slice/crio-275b7664a9752e3935f384cd42fa92a626ebe7af03267d645869fc5e152276f5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod672d89d2_46b4_449f_ad71_2716d50eb2fe.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab97627_6b7f_4984_af5f_a732f13b2486.slice/crio-conmon-c58e5b9c34ddbe30187350ca2343ceb9e751aeabfefecaacc487d2225fa6920e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250d2c0d_993b_466a_a5e0_bacae5fe8df5.slice/crio-conmon-275b7664a9752e3935f384cd42fa92a626ebe7af03267d645869fc5e152276f5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0954e7_1af0_4363_9fd4_8713146c0bc6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3f0f8f_0b63_4775_a794_adb5f51cfe66.slice/crio-521b29dcdf0aae20d2f26d535950e34532f6eb437ccb32433c66e3fac28df7ac\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3f0f8f_0b63_4775_a794_adb5f51cfe66.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3f0f8f_0b63_4775_a794_adb5f51cfe66.slice/crio-32c5917b694cba85411826ec18fe5346f9c4d4e9441c2f979e5480f971e8e488.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab97627_6b7f_4984_af5f_a732f13b2486.slice/crio-accece7c95c66a08119647336a9c3a2bf1fa2a3706527a1b2e0a8b33c272492d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab97627_6b7f_4984_af5f_a732f13b2486.slice/crio-c58e5b9c34ddbe30187350ca2343ceb9e751aeabfefecaacc487d2225fa6920e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab97627_6b7f_4984_af5f_a732f13b2486.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3f0f8f_0b63_4775_a794_adb5f51cfe66.slice/crio-conmon-32c5917b694cba85411826ec18fe5346f9c4d4e9441c2f979e5480f971e8e488.scope\": RecentStats: unable to find data in memory cache]"
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.416106 4941 generic.go:334] "Generic (PLEG): container finished" podID="6712cc3f-2e68-4522-8624-84903da6d19d" containerID="9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e" exitCode=137
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.416173 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6712cc3f-2e68-4522-8624-84903da6d19d","Type":"ContainerDied","Data":"9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e"}
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.416545 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6712cc3f-2e68-4522-8624-84903da6d19d","Type":"ContainerDied","Data":"8abc501831e1c09dad8a5ba8bb62774660563f5b0122251592f1276911ff2bcf"}
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.416580 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8abc501831e1c09dad8a5ba8bb62774660563f5b0122251592f1276911ff2bcf"
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.417755 4941 generic.go:334] "Generic (PLEG): container finished" podID="a6edec94-977f-493a-97b6-f90e02d07467" containerID="4538e4358b9525e75783e8afd80be20d7156046fb83eecb859f5b485afa2078b" exitCode=137
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.417786 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a6edec94-977f-493a-97b6-f90e02d07467","Type":"ContainerDied","Data":"4538e4358b9525e75783e8afd80be20d7156046fb83eecb859f5b485afa2078b"}
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.451500 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.455488 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.593830 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-combined-ca-bundle\") pod \"6712cc3f-2e68-4522-8624-84903da6d19d\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") "
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.594823 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqcp6\" (UniqueName: \"kubernetes.io/projected/a6edec94-977f-493a-97b6-f90e02d07467-kube-api-access-zqcp6\") pod \"a6edec94-977f-493a-97b6-f90e02d07467\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") "
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.594920 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-config-data\") pod \"a6edec94-977f-493a-97b6-f90e02d07467\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") "
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.594973 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-config-data\") pod \"6712cc3f-2e68-4522-8624-84903da6d19d\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") "
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.595764 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgtl7\" (UniqueName: \"kubernetes.io/projected/6712cc3f-2e68-4522-8624-84903da6d19d-kube-api-access-qgtl7\") pod \"6712cc3f-2e68-4522-8624-84903da6d19d\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") "
Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.595845 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6712cc3f-2e68-4522-8624-84903da6d19d-logs\") pod \"6712cc3f-2e68-4522-8624-84903da6d19d\" (UID: \"6712cc3f-2e68-4522-8624-84903da6d19d\") " Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.595953 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-combined-ca-bundle\") pod \"a6edec94-977f-493a-97b6-f90e02d07467\" (UID: \"a6edec94-977f-493a-97b6-f90e02d07467\") " Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.596578 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6712cc3f-2e68-4522-8624-84903da6d19d-logs" (OuterVolumeSpecName: "logs") pod "6712cc3f-2e68-4522-8624-84903da6d19d" (UID: "6712cc3f-2e68-4522-8624-84903da6d19d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.597711 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6712cc3f-2e68-4522-8624-84903da6d19d-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.601199 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6edec94-977f-493a-97b6-f90e02d07467-kube-api-access-zqcp6" (OuterVolumeSpecName: "kube-api-access-zqcp6") pod "a6edec94-977f-493a-97b6-f90e02d07467" (UID: "a6edec94-977f-493a-97b6-f90e02d07467"). InnerVolumeSpecName "kube-api-access-zqcp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.621541 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6712cc3f-2e68-4522-8624-84903da6d19d-kube-api-access-qgtl7" (OuterVolumeSpecName: "kube-api-access-qgtl7") pod "6712cc3f-2e68-4522-8624-84903da6d19d" (UID: "6712cc3f-2e68-4522-8624-84903da6d19d"). InnerVolumeSpecName "kube-api-access-qgtl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.629904 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-config-data" (OuterVolumeSpecName: "config-data") pod "6712cc3f-2e68-4522-8624-84903da6d19d" (UID: "6712cc3f-2e68-4522-8624-84903da6d19d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.633131 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6edec94-977f-493a-97b6-f90e02d07467" (UID: "a6edec94-977f-493a-97b6-f90e02d07467"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.635004 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-config-data" (OuterVolumeSpecName: "config-data") pod "a6edec94-977f-493a-97b6-f90e02d07467" (UID: "a6edec94-977f-493a-97b6-f90e02d07467"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.644395 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6712cc3f-2e68-4522-8624-84903da6d19d" (UID: "6712cc3f-2e68-4522-8624-84903da6d19d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.699533 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.699801 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.699890 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqcp6\" (UniqueName: \"kubernetes.io/projected/a6edec94-977f-493a-97b6-f90e02d07467-kube-api-access-zqcp6\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.699992 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6edec94-977f-493a-97b6-f90e02d07467-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.700079 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6712cc3f-2e68-4522-8624-84903da6d19d-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:31 crc kubenswrapper[4941]: I0307 07:15:31.700163 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgtl7\" 
(UniqueName: \"kubernetes.io/projected/6712cc3f-2e68-4522-8624-84903da6d19d-kube-api-access-qgtl7\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.141906 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.142622 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.144526 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.145938 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.429227 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a6edec94-977f-493a-97b6-f90e02d07467","Type":"ContainerDied","Data":"dc6950657c5ab8fb8999ca939674220f8a785809d3a9cd68c61e8c9952acf6bc"} Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.429292 4941 scope.go:117] "RemoveContainer" containerID="4538e4358b9525e75783e8afd80be20d7156046fb83eecb859f5b485afa2078b" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.429497 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.430305 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.430393 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.442428 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.484954 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.512500 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.521814 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.533225 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.550890 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:15:32 crc kubenswrapper[4941]: E0307 07:15:32.551477 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6edec94-977f-493a-97b6-f90e02d07467" containerName="nova-cell1-novncproxy-novncproxy" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.551503 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6edec94-977f-493a-97b6-f90e02d07467" containerName="nova-cell1-novncproxy-novncproxy" Mar 07 07:15:32 crc kubenswrapper[4941]: E0307 07:15:32.551546 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" containerName="nova-metadata-metadata" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.551555 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" containerName="nova-metadata-metadata" 
Mar 07 07:15:32 crc kubenswrapper[4941]: E0307 07:15:32.551572 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" containerName="nova-metadata-log" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.551580 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" containerName="nova-metadata-log" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.551829 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" containerName="nova-metadata-metadata" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.551853 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" containerName="nova-metadata-log" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.551885 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6edec94-977f-493a-97b6-f90e02d07467" containerName="nova-cell1-novncproxy-novncproxy" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.552700 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.558573 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.558857 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.559033 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.562748 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.564469 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.569308 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.569321 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.575777 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.592394 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.689438 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-xmpfq"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.691836 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.710909 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-xmpfq"] Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.719330 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786b9\" (UniqueName: \"kubernetes.io/projected/056debcc-d271-4ea1-a70c-fc67794f060e-kube-api-access-786b9\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.719640 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.719865 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.720009 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.720134 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.720289 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-config-data\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.720385 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.720505 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e4233a-6075-4eab-8cc4-e1fa1e892931-logs\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.720632 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.720755 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cx8gh\" (UniqueName: \"kubernetes.io/projected/52e4233a-6075-4eab-8cc4-e1fa1e892931-kube-api-access-cx8gh\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824354 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824451 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-config\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824501 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-config-data\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824527 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824545 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e4233a-6075-4eab-8cc4-e1fa1e892931-logs\") pod \"nova-metadata-0\" (UID: 
\"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824561 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824576 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8gh\" (UniqueName: \"kubernetes.io/projected/52e4233a-6075-4eab-8cc4-e1fa1e892931-kube-api-access-cx8gh\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824594 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-svc\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824613 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-786b9\" (UniqueName: \"kubernetes.io/projected/056debcc-d271-4ea1-a70c-fc67794f060e-kube-api-access-786b9\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.824636 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" 
Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.825548 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e4233a-6075-4eab-8cc4-e1fa1e892931-logs\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.825751 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.825885 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.825915 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-swift-storage-0\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.826083 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5k24\" (UniqueName: \"kubernetes.io/projected/1198c32e-6783-497e-a232-5dd01865ecfd-kube-api-access-n5k24\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 
crc kubenswrapper[4941]: I0307 07:15:32.826183 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.826264 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.829084 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.829116 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.829660 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-config-data\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.830054 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.830875 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.838963 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.841638 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.843247 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8gh\" (UniqueName: \"kubernetes.io/projected/52e4233a-6075-4eab-8cc4-e1fa1e892931-kube-api-access-cx8gh\") pod \"nova-metadata-0\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.844894 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-786b9\" (UniqueName: \"kubernetes.io/projected/056debcc-d271-4ea1-a70c-fc67794f060e-kube-api-access-786b9\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.891260 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.914707 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.927109 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5k24\" (UniqueName: \"kubernetes.io/projected/1198c32e-6783-497e-a232-5dd01865ecfd-kube-api-access-n5k24\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.927207 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-config\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.927267 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-svc\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.927307 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" 
Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.927344 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.927363 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-swift-storage-0\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.928353 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-svc\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.928356 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-swift-storage-0\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.928557 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-config\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.928586 4941 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.928974 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:32 crc kubenswrapper[4941]: I0307 07:15:32.951366 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5k24\" (UniqueName: \"kubernetes.io/projected/1198c32e-6783-497e-a232-5dd01865ecfd-kube-api-access-n5k24\") pod \"dnsmasq-dns-9dd56c4d5-xmpfq\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:33 crc kubenswrapper[4941]: I0307 07:15:33.039819 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:33 crc kubenswrapper[4941]: I0307 07:15:33.413879 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:15:33 crc kubenswrapper[4941]: W0307 07:15:33.415424 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod056debcc_d271_4ea1_a70c_fc67794f060e.slice/crio-5bdda605773738a0c60ba4bd0138471e63090a8ed2166ba138afc3d68b2ba191 WatchSource:0}: Error finding container 5bdda605773738a0c60ba4bd0138471e63090a8ed2166ba138afc3d68b2ba191: Status 404 returned error can't find the container with id 5bdda605773738a0c60ba4bd0138471e63090a8ed2166ba138afc3d68b2ba191 Mar 07 07:15:33 crc kubenswrapper[4941]: I0307 07:15:33.455208 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"056debcc-d271-4ea1-a70c-fc67794f060e","Type":"ContainerStarted","Data":"5bdda605773738a0c60ba4bd0138471e63090a8ed2166ba138afc3d68b2ba191"} Mar 07 07:15:33 crc kubenswrapper[4941]: I0307 07:15:33.509267 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:33 crc kubenswrapper[4941]: I0307 07:15:33.611871 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-xmpfq"] Mar 07 07:15:33 crc kubenswrapper[4941]: W0307 07:15:33.616648 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1198c32e_6783_497e_a232_5dd01865ecfd.slice/crio-167e36efc1cb87a7e1b8c0a627ff3357c9836da8620dd3d881931aab794b18b0 WatchSource:0}: Error finding container 167e36efc1cb87a7e1b8c0a627ff3357c9836da8620dd3d881931aab794b18b0: Status 404 returned error can't find the container with id 167e36efc1cb87a7e1b8c0a627ff3357c9836da8620dd3d881931aab794b18b0 Mar 07 07:15:33 crc kubenswrapper[4941]: I0307 07:15:33.967708 
4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6712cc3f-2e68-4522-8624-84903da6d19d" path="/var/lib/kubelet/pods/6712cc3f-2e68-4522-8624-84903da6d19d/volumes" Mar 07 07:15:33 crc kubenswrapper[4941]: I0307 07:15:33.968715 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6edec94-977f-493a-97b6-f90e02d07467" path="/var/lib/kubelet/pods/a6edec94-977f-493a-97b6-f90e02d07467/volumes" Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.465104 4941 generic.go:334] "Generic (PLEG): container finished" podID="1198c32e-6783-497e-a232-5dd01865ecfd" containerID="60388b14da3f3917c16c54cde22e16752e910c2ccf741bf0069462a58ed79601" exitCode=0 Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.465170 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" event={"ID":"1198c32e-6783-497e-a232-5dd01865ecfd","Type":"ContainerDied","Data":"60388b14da3f3917c16c54cde22e16752e910c2ccf741bf0069462a58ed79601"} Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.465196 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" event={"ID":"1198c32e-6783-497e-a232-5dd01865ecfd","Type":"ContainerStarted","Data":"167e36efc1cb87a7e1b8c0a627ff3357c9836da8620dd3d881931aab794b18b0"} Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.466766 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52e4233a-6075-4eab-8cc4-e1fa1e892931","Type":"ContainerStarted","Data":"38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3"} Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.466818 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52e4233a-6075-4eab-8cc4-e1fa1e892931","Type":"ContainerStarted","Data":"039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d"} Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.466832 4941 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52e4233a-6075-4eab-8cc4-e1fa1e892931","Type":"ContainerStarted","Data":"46085642a25947b7edcc993722705214c8a8d1e89ce615a82fe352c445f5a486"} Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.469985 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"056debcc-d271-4ea1-a70c-fc67794f060e","Type":"ContainerStarted","Data":"d6c2f62c9103f19083c550098257ae768e69623eab5370a23a6b39d03261c98b"} Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.512853 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.512830136 podStartE2EDuration="2.512830136s" podCreationTimestamp="2026-03-07 07:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:34.504865385 +0000 UTC m=+1431.457230860" watchObservedRunningTime="2026-03-07 07:15:34.512830136 +0000 UTC m=+1431.465195601" Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.528713 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.528691617 podStartE2EDuration="2.528691617s" podCreationTimestamp="2026-03-07 07:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:34.523730032 +0000 UTC m=+1431.476095497" watchObservedRunningTime="2026-03-07 07:15:34.528691617 +0000 UTC m=+1431.481057082" Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.676215 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.676879 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="ceilometer-central-agent" containerID="cri-o://e95d284d1897da3150166394fed85abe100a84be0cf4a78984cff4e5dda880f9" gracePeriod=30 Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.676999 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="proxy-httpd" containerID="cri-o://1e6a51c773f67d6c421a2aa2916d5fef7dc6722f7f17013720f33cfe20edd934" gracePeriod=30 Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.677039 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="sg-core" containerID="cri-o://1129d89f81b40b4d80c583a41cfcad8c875d8ecf9fd33d49b88c5c50f4cb4147" gracePeriod=30 Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.677070 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="ceilometer-notification-agent" containerID="cri-o://11566e65a4e20367dbc4e401b2b98d6fce48d8fcaf9c840c54728f3b24d8e648" gracePeriod=30 Mar 07 07:15:34 crc kubenswrapper[4941]: I0307 07:15:34.785652 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.197:3000/\": read tcp 10.217.0.2:55628->10.217.0.197:3000: read: connection reset by peer" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.485820 4941 generic.go:334] "Generic (PLEG): container finished" podID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerID="1e6a51c773f67d6c421a2aa2916d5fef7dc6722f7f17013720f33cfe20edd934" exitCode=0 Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.486134 4941 generic.go:334] "Generic (PLEG): container finished" 
podID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerID="1129d89f81b40b4d80c583a41cfcad8c875d8ecf9fd33d49b88c5c50f4cb4147" exitCode=2 Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.486143 4941 generic.go:334] "Generic (PLEG): container finished" podID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerID="11566e65a4e20367dbc4e401b2b98d6fce48d8fcaf9c840c54728f3b24d8e648" exitCode=0 Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.486150 4941 generic.go:334] "Generic (PLEG): container finished" podID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerID="e95d284d1897da3150166394fed85abe100a84be0cf4a78984cff4e5dda880f9" exitCode=0 Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.486188 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerDied","Data":"1e6a51c773f67d6c421a2aa2916d5fef7dc6722f7f17013720f33cfe20edd934"} Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.486214 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerDied","Data":"1129d89f81b40b4d80c583a41cfcad8c875d8ecf9fd33d49b88c5c50f4cb4147"} Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.486224 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerDied","Data":"11566e65a4e20367dbc4e401b2b98d6fce48d8fcaf9c840c54728f3b24d8e648"} Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.486231 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerDied","Data":"e95d284d1897da3150166394fed85abe100a84be0cf4a78984cff4e5dda880f9"} Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.489393 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" 
event={"ID":"1198c32e-6783-497e-a232-5dd01865ecfd","Type":"ContainerStarted","Data":"5125f9279c7ec81414bff41771650af0c2b88b60d9d5f2957f576c5e9948b646"} Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.489626 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.520638 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" podStartSLOduration=3.5206197169999998 podStartE2EDuration="3.520619717s" podCreationTimestamp="2026-03-07 07:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:35.517365345 +0000 UTC m=+1432.469730810" watchObservedRunningTime="2026-03-07 07:15:35.520619717 +0000 UTC m=+1432.472985192" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.599863 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.600064 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-log" containerID="cri-o://c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6" gracePeriod=30 Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.600512 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-api" containerID="cri-o://c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb" gracePeriod=30 Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.693778 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.878827 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-combined-ca-bundle\") pod \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.878914 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-config-data\") pod \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.879029 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-sg-core-conf-yaml\") pod \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.879057 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-run-httpd\") pod \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.879098 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-log-httpd\") pod \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.879172 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-scripts\") pod \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.879199 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsgwh\" (UniqueName: \"kubernetes.io/projected/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-kube-api-access-xsgwh\") pod \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\" (UID: \"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046\") " Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.880169 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" (UID: "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.880601 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" (UID: "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.884685 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-scripts" (OuterVolumeSpecName: "scripts") pod "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" (UID: "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.885045 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-kube-api-access-xsgwh" (OuterVolumeSpecName: "kube-api-access-xsgwh") pod "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" (UID: "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046"). InnerVolumeSpecName "kube-api-access-xsgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.909005 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" (UID: "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.970836 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" (UID: "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.981880 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.981912 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsgwh\" (UniqueName: \"kubernetes.io/projected/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-kube-api-access-xsgwh\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.981924 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.981934 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.981944 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.981955 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:35 crc kubenswrapper[4941]: I0307 07:15:35.992872 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-config-data" (OuterVolumeSpecName: "config-data") pod "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" (UID: "dfc6ede7-cdf5-4bae-a384-cb9a46fb5046"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.083345 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.499147 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.499655 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfc6ede7-cdf5-4bae-a384-cb9a46fb5046","Type":"ContainerDied","Data":"15b2f9112e2570586f2813aa1e335fef526cc88403c0d98849f31049fa12b4e8"} Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.500533 4941 scope.go:117] "RemoveContainer" containerID="1e6a51c773f67d6c421a2aa2916d5fef7dc6722f7f17013720f33cfe20edd934" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.502741 4941 generic.go:334] "Generic (PLEG): container finished" podID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerID="c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6" exitCode=143 Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.503681 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd8a66f-ad94-48d8-adc6-e71e962db352","Type":"ContainerDied","Data":"c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6"} Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.527272 4941 scope.go:117] "RemoveContainer" containerID="1129d89f81b40b4d80c583a41cfcad8c875d8ecf9fd33d49b88c5c50f4cb4147" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.556526 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.567511 4941 scope.go:117] "RemoveContainer" 
containerID="11566e65a4e20367dbc4e401b2b98d6fce48d8fcaf9c840c54728f3b24d8e648" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.600195 4941 scope.go:117] "RemoveContainer" containerID="e95d284d1897da3150166394fed85abe100a84be0cf4a78984cff4e5dda880f9" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.622520 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.626076 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:36 crc kubenswrapper[4941]: E0307 07:15:36.626586 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="ceilometer-notification-agent" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.626604 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="ceilometer-notification-agent" Mar 07 07:15:36 crc kubenswrapper[4941]: E0307 07:15:36.626631 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="proxy-httpd" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.626638 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="proxy-httpd" Mar 07 07:15:36 crc kubenswrapper[4941]: E0307 07:15:36.626648 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="sg-core" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.626655 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="sg-core" Mar 07 07:15:36 crc kubenswrapper[4941]: E0307 07:15:36.626675 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="ceilometer-central-agent" Mar 07 07:15:36 crc 
kubenswrapper[4941]: I0307 07:15:36.626682 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="ceilometer-central-agent" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.626891 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="sg-core" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.626911 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="ceilometer-central-agent" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.626924 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="proxy-httpd" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.626937 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" containerName="ceilometer-notification-agent" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.629000 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.632453 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.632586 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.640769 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.763163 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:36 crc kubenswrapper[4941]: E0307 07:15:36.764001 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-llxl8 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="bb2a3f1f-9329-4da0-90b4-21060470b45b" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.798478 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.798575 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-log-httpd\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.798668 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.798743 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-scripts\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.798780 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-config-data\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.798810 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-run-httpd\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.798841 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llxl8\" (UniqueName: \"kubernetes.io/projected/bb2a3f1f-9329-4da0-90b4-21060470b45b-kube-api-access-llxl8\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.900870 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-log-httpd\") pod \"ceilometer-0\" (UID: 
\"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.900954 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.900986 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-scripts\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.901010 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-config-data\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.901029 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-run-httpd\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.901050 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llxl8\" (UniqueName: \"kubernetes.io/projected/bb2a3f1f-9329-4da0-90b4-21060470b45b-kube-api-access-llxl8\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.901108 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.901985 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-log-httpd\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.902747 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-run-httpd\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.906624 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.907577 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-config-data\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.915629 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 
07:15:36.916531 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-scripts\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:36 crc kubenswrapper[4941]: I0307 07:15:36.919441 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llxl8\" (UniqueName: \"kubernetes.io/projected/bb2a3f1f-9329-4da0-90b4-21060470b45b-kube-api-access-llxl8\") pod \"ceilometer-0\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " pod="openstack/ceilometer-0" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.525153 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.540515 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.724950 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-log-httpd\") pod \"bb2a3f1f-9329-4da0-90b4-21060470b45b\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.725088 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-config-data\") pod \"bb2a3f1f-9329-4da0-90b4-21060470b45b\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.725115 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llxl8\" (UniqueName: \"kubernetes.io/projected/bb2a3f1f-9329-4da0-90b4-21060470b45b-kube-api-access-llxl8\") pod \"bb2a3f1f-9329-4da0-90b4-21060470b45b\" (UID: 
\"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.725194 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-scripts\") pod \"bb2a3f1f-9329-4da0-90b4-21060470b45b\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.725245 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-sg-core-conf-yaml\") pod \"bb2a3f1f-9329-4da0-90b4-21060470b45b\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.725315 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-combined-ca-bundle\") pod \"bb2a3f1f-9329-4da0-90b4-21060470b45b\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.725336 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-run-httpd\") pod \"bb2a3f1f-9329-4da0-90b4-21060470b45b\" (UID: \"bb2a3f1f-9329-4da0-90b4-21060470b45b\") " Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.725933 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb2a3f1f-9329-4da0-90b4-21060470b45b" (UID: "bb2a3f1f-9329-4da0-90b4-21060470b45b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.726133 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb2a3f1f-9329-4da0-90b4-21060470b45b" (UID: "bb2a3f1f-9329-4da0-90b4-21060470b45b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.730595 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-config-data" (OuterVolumeSpecName: "config-data") pod "bb2a3f1f-9329-4da0-90b4-21060470b45b" (UID: "bb2a3f1f-9329-4da0-90b4-21060470b45b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.731440 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-scripts" (OuterVolumeSpecName: "scripts") pod "bb2a3f1f-9329-4da0-90b4-21060470b45b" (UID: "bb2a3f1f-9329-4da0-90b4-21060470b45b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.732838 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2a3f1f-9329-4da0-90b4-21060470b45b-kube-api-access-llxl8" (OuterVolumeSpecName: "kube-api-access-llxl8") pod "bb2a3f1f-9329-4da0-90b4-21060470b45b" (UID: "bb2a3f1f-9329-4da0-90b4-21060470b45b"). InnerVolumeSpecName "kube-api-access-llxl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.733582 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb2a3f1f-9329-4da0-90b4-21060470b45b" (UID: "bb2a3f1f-9329-4da0-90b4-21060470b45b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.735918 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb2a3f1f-9329-4da0-90b4-21060470b45b" (UID: "bb2a3f1f-9329-4da0-90b4-21060470b45b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.828231 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llxl8\" (UniqueName: \"kubernetes.io/projected/bb2a3f1f-9329-4da0-90b4-21060470b45b-kube-api-access-llxl8\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.828278 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.828297 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.828316 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.828333 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.828350 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f1f-9329-4da0-90b4-21060470b45b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.828367 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2a3f1f-9329-4da0-90b4-21060470b45b-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.895331 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.915192 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.915686 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 07:15:37 crc kubenswrapper[4941]: I0307 07:15:37.971054 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc6ede7-cdf5-4bae-a384-cb9a46fb5046" path="/var/lib/kubelet/pods/dfc6ede7-cdf5-4bae-a384-cb9a46fb5046/volumes" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.533208 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.593806 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.600585 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.616710 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.619977 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.623550 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.624268 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.630634 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.746500 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-log-httpd\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.746719 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-config-data\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.746828 4941 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-scripts\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.746952 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lzkl\" (UniqueName: \"kubernetes.io/projected/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-kube-api-access-9lzkl\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.747072 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-run-httpd\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.747148 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.747216 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.849870 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lzkl\" (UniqueName: 
\"kubernetes.io/projected/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-kube-api-access-9lzkl\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.850680 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-run-httpd\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.851486 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-run-httpd\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.851698 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.855188 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.855464 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-log-httpd\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 
07:15:38.855530 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-config-data\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.855573 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-scripts\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.856423 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-log-httpd\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.857875 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.859421 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.862065 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-config-data\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " 
pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.863295 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-scripts\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.870497 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lzkl\" (UniqueName: \"kubernetes.io/projected/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-kube-api-access-9lzkl\") pod \"ceilometer-0\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " pod="openstack/ceilometer-0" Mar 07 07:15:38 crc kubenswrapper[4941]: I0307 07:15:38.948000 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.179921 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.368321 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd8a66f-ad94-48d8-adc6-e71e962db352-logs\") pod \"4cd8a66f-ad94-48d8-adc6-e71e962db352\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.368392 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsmh6\" (UniqueName: \"kubernetes.io/projected/4cd8a66f-ad94-48d8-adc6-e71e962db352-kube-api-access-lsmh6\") pod \"4cd8a66f-ad94-48d8-adc6-e71e962db352\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.368469 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-config-data\") pod \"4cd8a66f-ad94-48d8-adc6-e71e962db352\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.368743 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-combined-ca-bundle\") pod \"4cd8a66f-ad94-48d8-adc6-e71e962db352\" (UID: \"4cd8a66f-ad94-48d8-adc6-e71e962db352\") " Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.369017 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd8a66f-ad94-48d8-adc6-e71e962db352-logs" (OuterVolumeSpecName: "logs") pod "4cd8a66f-ad94-48d8-adc6-e71e962db352" (UID: "4cd8a66f-ad94-48d8-adc6-e71e962db352"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.369342 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd8a66f-ad94-48d8-adc6-e71e962db352-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.373989 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd8a66f-ad94-48d8-adc6-e71e962db352-kube-api-access-lsmh6" (OuterVolumeSpecName: "kube-api-access-lsmh6") pod "4cd8a66f-ad94-48d8-adc6-e71e962db352" (UID: "4cd8a66f-ad94-48d8-adc6-e71e962db352"). InnerVolumeSpecName "kube-api-access-lsmh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.400565 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-config-data" (OuterVolumeSpecName: "config-data") pod "4cd8a66f-ad94-48d8-adc6-e71e962db352" (UID: "4cd8a66f-ad94-48d8-adc6-e71e962db352"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.422246 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cd8a66f-ad94-48d8-adc6-e71e962db352" (UID: "4cd8a66f-ad94-48d8-adc6-e71e962db352"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.430068 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:15:39 crc kubenswrapper[4941]: W0307 07:15:39.431666 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3ea6f6c_45a7_4071_8783_25a2fc52e5e3.slice/crio-33a90ae920faa6a7a10efa1f0537e285f3d0ca62528cdb38070277f156364966 WatchSource:0}: Error finding container 33a90ae920faa6a7a10efa1f0537e285f3d0ca62528cdb38070277f156364966: Status 404 returned error can't find the container with id 33a90ae920faa6a7a10efa1f0537e285f3d0ca62528cdb38070277f156364966 Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.471633 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsmh6\" (UniqueName: \"kubernetes.io/projected/4cd8a66f-ad94-48d8-adc6-e71e962db352-kube-api-access-lsmh6\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.471675 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.471685 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd8a66f-ad94-48d8-adc6-e71e962db352-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.544333 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerStarted","Data":"33a90ae920faa6a7a10efa1f0537e285f3d0ca62528cdb38070277f156364966"} Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.547064 4941 generic.go:334] "Generic (PLEG): container finished" 
podID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerID="c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb" exitCode=0 Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.547119 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd8a66f-ad94-48d8-adc6-e71e962db352","Type":"ContainerDied","Data":"c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb"} Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.547131 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.547157 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd8a66f-ad94-48d8-adc6-e71e962db352","Type":"ContainerDied","Data":"451fa123a8125c075e8f8ecb1395c3c1ed5b9df4559b31c289a4eb24656f1d44"} Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.547179 4941 scope.go:117] "RemoveContainer" containerID="c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.576965 4941 scope.go:117] "RemoveContainer" containerID="c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.627984 4941 scope.go:117] "RemoveContainer" containerID="c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb" Mar 07 07:15:39 crc kubenswrapper[4941]: E0307 07:15:39.628567 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb\": container with ID starting with c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb not found: ID does not exist" containerID="c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.628603 4941 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb"} err="failed to get container status \"c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb\": rpc error: code = NotFound desc = could not find container \"c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb\": container with ID starting with c42bfc87bb5fb1a2d9862a61bbd0ce45dc6e2c2fcb10ff527c61fc228dfc8ddb not found: ID does not exist" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.628629 4941 scope.go:117] "RemoveContainer" containerID="c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.630484 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:39 crc kubenswrapper[4941]: E0307 07:15:39.632770 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6\": container with ID starting with c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6 not found: ID does not exist" containerID="c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.632812 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6"} err="failed to get container status \"c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6\": rpc error: code = NotFound desc = could not find container \"c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6\": container with ID starting with c679ce65c7e6eb73c460b500d1f032651e34e86afd48d7f5e434e3fe618a13b6 not found: ID does not exist" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.639209 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-0"] Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.646280 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:39 crc kubenswrapper[4941]: E0307 07:15:39.646685 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-api" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.646705 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-api" Mar 07 07:15:39 crc kubenswrapper[4941]: E0307 07:15:39.646721 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-log" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.646728 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-log" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.646894 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-api" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.646905 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" containerName="nova-api-log" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.647844 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.649859 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.650005 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.651459 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.654883 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.698719 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5h9p\" (UniqueName: \"kubernetes.io/projected/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-kube-api-access-t5h9p\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.699044 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.699315 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-public-tls-certs\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.699379 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-config-data\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.699413 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.699471 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-logs\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.801002 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-public-tls-certs\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.801056 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-config-data\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.801081 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 
07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.801108 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-logs\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.801167 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5h9p\" (UniqueName: \"kubernetes.io/projected/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-kube-api-access-t5h9p\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.801247 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.802193 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-logs\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.805551 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-config-data\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.805588 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.807559 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.808331 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.823438 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5h9p\" (UniqueName: \"kubernetes.io/projected/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-kube-api-access-t5h9p\") pod \"nova-api-0\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " pod="openstack/nova-api-0" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.966264 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd8a66f-ad94-48d8-adc6-e71e962db352" path="/var/lib/kubelet/pods/4cd8a66f-ad94-48d8-adc6-e71e962db352/volumes" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.966901 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2a3f1f-9329-4da0-90b4-21060470b45b" path="/var/lib/kubelet/pods/bb2a3f1f-9329-4da0-90b4-21060470b45b/volumes" Mar 07 07:15:39 crc kubenswrapper[4941]: I0307 07:15:39.992700 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:15:40 crc kubenswrapper[4941]: W0307 07:15:40.436219 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb96d8945_3e4c_4bb2_9fbc_383b647cb27d.slice/crio-f6707638e25925b40ba89ccb7b0497c05c629790e4ef92c5ff04a2d63d7c90d1 WatchSource:0}: Error finding container f6707638e25925b40ba89ccb7b0497c05c629790e4ef92c5ff04a2d63d7c90d1: Status 404 returned error can't find the container with id f6707638e25925b40ba89ccb7b0497c05c629790e4ef92c5ff04a2d63d7c90d1 Mar 07 07:15:40 crc kubenswrapper[4941]: I0307 07:15:40.437693 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:40 crc kubenswrapper[4941]: I0307 07:15:40.571340 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerStarted","Data":"361d398e8fc72c653035917cc9b68c1757d8c3c0e6a0df9e4f2f5f42e6a187b4"} Mar 07 07:15:40 crc kubenswrapper[4941]: I0307 07:15:40.573157 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b96d8945-3e4c-4bb2-9fbc-383b647cb27d","Type":"ContainerStarted","Data":"f6707638e25925b40ba89ccb7b0497c05c629790e4ef92c5ff04a2d63d7c90d1"} Mar 07 07:15:41 crc kubenswrapper[4941]: I0307 07:15:41.582890 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b96d8945-3e4c-4bb2-9fbc-383b647cb27d","Type":"ContainerStarted","Data":"77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4"} Mar 07 07:15:41 crc kubenswrapper[4941]: I0307 07:15:41.583547 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b96d8945-3e4c-4bb2-9fbc-383b647cb27d","Type":"ContainerStarted","Data":"2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb"} Mar 07 07:15:41 crc kubenswrapper[4941]: I0307 07:15:41.585906 4941 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerStarted","Data":"7a36b56ccfdf2599202a3e46368a0bcacd3caccbe64a1360dfc5993ca0555bbd"} Mar 07 07:15:42 crc kubenswrapper[4941]: I0307 07:15:42.601948 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerStarted","Data":"e06f7599440d38a4a93fa7deb481a34fe8928621b5a45e53d11879ac203ec92c"} Mar 07 07:15:42 crc kubenswrapper[4941]: I0307 07:15:42.891558 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:42 crc kubenswrapper[4941]: I0307 07:15:42.916225 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 07 07:15:42 crc kubenswrapper[4941]: I0307 07:15:42.916265 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 07 07:15:42 crc kubenswrapper[4941]: I0307 07:15:42.924145 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:42 crc kubenswrapper[4941]: I0307 07:15:42.943095 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.943070455 podStartE2EDuration="3.943070455s" podCreationTimestamp="2026-03-07 07:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:41.605879169 +0000 UTC m=+1438.558244644" watchObservedRunningTime="2026-03-07 07:15:42.943070455 +0000 UTC m=+1439.895435950" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.041527 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:15:43 crc 
kubenswrapper[4941]: I0307 07:15:43.168698 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-fxq48"] Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.168936 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" podUID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" containerName="dnsmasq-dns" containerID="cri-o://b2176482b6780188feae01350fd7d7388cc5f70b264e779954fd672db582c776" gracePeriod=10 Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.645412 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerStarted","Data":"b4f512e0aa7ca8a4ed5bf77ec13dfb0ab712fd25b64fb2f6461583106c2b0186"} Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.646450 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.661477 4941 generic.go:334] "Generic (PLEG): container finished" podID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" containerID="b2176482b6780188feae01350fd7d7388cc5f70b264e779954fd672db582c776" exitCode=0 Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.662680 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" event={"ID":"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d","Type":"ContainerDied","Data":"b2176482b6780188feae01350fd7d7388cc5f70b264e779954fd672db582c776"} Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.688677 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.694798 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.713772 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.966520652 podStartE2EDuration="5.713753031s" podCreationTimestamp="2026-03-07 07:15:38 +0000 UTC" firstStartedPulling="2026-03-07 07:15:39.435149448 +0000 UTC m=+1436.387514913" lastFinishedPulling="2026-03-07 07:15:43.182381827 +0000 UTC m=+1440.134747292" observedRunningTime="2026-03-07 07:15:43.671137445 +0000 UTC m=+1440.623502920" watchObservedRunningTime="2026-03-07 07:15:43.713753031 +0000 UTC m=+1440.666118496" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.831832 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ppswq"] Mar 07 07:15:43 crc kubenswrapper[4941]: E0307 07:15:43.832613 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" containerName="init" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.832629 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" containerName="init" Mar 07 07:15:43 crc kubenswrapper[4941]: E0307 07:15:43.832660 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" containerName="dnsmasq-dns" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.832666 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" containerName="dnsmasq-dns" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.832850 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" containerName="dnsmasq-dns" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.833459 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.836294 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.836771 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.841491 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ppswq"] Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.875747 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-nb\") pod \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.875780 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-swift-storage-0\") pod \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.875842 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-sb\") pod \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.875881 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-svc\") pod \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " Mar 07 
07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.875943 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-config\") pod \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.875968 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5f75\" (UniqueName: \"kubernetes.io/projected/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-kube-api-access-d5f75\") pod \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\" (UID: \"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d\") " Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.882285 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-kube-api-access-d5f75" (OuterVolumeSpecName: "kube-api-access-d5f75") pod "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" (UID: "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d"). InnerVolumeSpecName "kube-api-access-d5f75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.926054 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" (UID: "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.927633 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.927592 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.933185 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" (UID: "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.934851 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" (UID: "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.940761 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" (UID: "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.942967 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-config" (OuterVolumeSpecName: "config") pod "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" (UID: "90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.977956 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-scripts\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978038 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978097 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-config-data\") pod 
\"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978183 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh9tz\" (UniqueName: \"kubernetes.io/projected/5ef55d7b-622e-4660-bc2a-990353dae291-kube-api-access-mh9tz\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978274 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978285 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978294 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978305 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978315 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:43 crc kubenswrapper[4941]: I0307 07:15:43.978325 4941 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-d5f75\" (UniqueName: \"kubernetes.io/projected/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d-kube-api-access-d5f75\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.079649 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.080422 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-config-data\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.080631 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh9tz\" (UniqueName: \"kubernetes.io/projected/5ef55d7b-622e-4660-bc2a-990353dae291-kube-api-access-mh9tz\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.080711 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-scripts\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.084805 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-combined-ca-bundle\") pod 
\"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.087824 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.087980 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.100034 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-config-data\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.103296 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-scripts\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.115999 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh9tz\" (UniqueName: \"kubernetes.io/projected/5ef55d7b-622e-4660-bc2a-990353dae291-kube-api-access-mh9tz\") pod \"nova-cell1-cell-mapping-ppswq\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.154795 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.687142 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.687053 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-fxq48" event={"ID":"90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d","Type":"ContainerDied","Data":"afec75b51812e99bc9cb9d8b7a72ad1a28d60db8ede1e30e8b75d4a67ca96ef1"} Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.687507 4941 scope.go:117] "RemoveContainer" containerID="b2176482b6780188feae01350fd7d7388cc5f70b264e779954fd672db582c776" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.744312 4941 scope.go:117] "RemoveContainer" containerID="782b4cb6800a49583e5d4febde0e2f055e88966b8bd69833d2b9828e747c886f" Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.759859 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-fxq48"] Mar 07 07:15:44 crc kubenswrapper[4941]: I0307 07:15:44.771690 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-fxq48"] Mar 07 07:15:45 crc kubenswrapper[4941]: I0307 07:15:45.249613 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ppswq"] Mar 07 07:15:45 crc kubenswrapper[4941]: I0307 07:15:45.708050 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ppswq" event={"ID":"5ef55d7b-622e-4660-bc2a-990353dae291","Type":"ContainerStarted","Data":"4dfaa1caa1c945cdadc6069017eb59bb909a3a544cec38ebaf0cb8113f6668da"} Mar 07 07:15:45 crc kubenswrapper[4941]: I0307 07:15:45.708530 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ppswq" event={"ID":"5ef55d7b-622e-4660-bc2a-990353dae291","Type":"ContainerStarted","Data":"168668ea3dbd55fe03b612de8eede2db36d79ffd6bec32988f0e9c15c45166c0"} Mar 07 07:15:45 crc kubenswrapper[4941]: I0307 07:15:45.975038 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d" path="/var/lib/kubelet/pods/90fbfdfb-e78c-4f2c-bfc5-5b8e6597637d/volumes" Mar 07 07:15:49 crc kubenswrapper[4941]: I0307 07:15:49.993225 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 07:15:49 crc kubenswrapper[4941]: I0307 07:15:49.993750 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 07:15:50 crc kubenswrapper[4941]: I0307 07:15:50.768615 4941 generic.go:334] "Generic (PLEG): container finished" podID="5ef55d7b-622e-4660-bc2a-990353dae291" containerID="4dfaa1caa1c945cdadc6069017eb59bb909a3a544cec38ebaf0cb8113f6668da" exitCode=0 Mar 07 07:15:50 crc kubenswrapper[4941]: I0307 07:15:50.768679 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ppswq" event={"ID":"5ef55d7b-622e-4660-bc2a-990353dae291","Type":"ContainerDied","Data":"4dfaa1caa1c945cdadc6069017eb59bb909a3a544cec38ebaf0cb8113f6668da"} Mar 07 07:15:51 crc kubenswrapper[4941]: I0307 07:15:51.005618 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 07:15:51 crc kubenswrapper[4941]: I0307 07:15:51.005618 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.281098 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.339637 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh9tz\" (UniqueName: \"kubernetes.io/projected/5ef55d7b-622e-4660-bc2a-990353dae291-kube-api-access-mh9tz\") pod \"5ef55d7b-622e-4660-bc2a-990353dae291\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.339759 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-combined-ca-bundle\") pod \"5ef55d7b-622e-4660-bc2a-990353dae291\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.339832 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-scripts\") pod \"5ef55d7b-622e-4660-bc2a-990353dae291\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.339902 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-config-data\") pod \"5ef55d7b-622e-4660-bc2a-990353dae291\" (UID: \"5ef55d7b-622e-4660-bc2a-990353dae291\") " Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.345998 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-scripts" (OuterVolumeSpecName: "scripts") pod "5ef55d7b-622e-4660-bc2a-990353dae291" (UID: "5ef55d7b-622e-4660-bc2a-990353dae291"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.346905 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef55d7b-622e-4660-bc2a-990353dae291-kube-api-access-mh9tz" (OuterVolumeSpecName: "kube-api-access-mh9tz") pod "5ef55d7b-622e-4660-bc2a-990353dae291" (UID: "5ef55d7b-622e-4660-bc2a-990353dae291"). InnerVolumeSpecName "kube-api-access-mh9tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.368237 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-config-data" (OuterVolumeSpecName: "config-data") pod "5ef55d7b-622e-4660-bc2a-990353dae291" (UID: "5ef55d7b-622e-4660-bc2a-990353dae291"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.370657 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ef55d7b-622e-4660-bc2a-990353dae291" (UID: "5ef55d7b-622e-4660-bc2a-990353dae291"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.441713 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh9tz\" (UniqueName: \"kubernetes.io/projected/5ef55d7b-622e-4660-bc2a-990353dae291-kube-api-access-mh9tz\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.441744 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.441753 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.441762 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef55d7b-622e-4660-bc2a-990353dae291-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.794105 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ppswq" event={"ID":"5ef55d7b-622e-4660-bc2a-990353dae291","Type":"ContainerDied","Data":"168668ea3dbd55fe03b612de8eede2db36d79ffd6bec32988f0e9c15c45166c0"} Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.794151 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="168668ea3dbd55fe03b612de8eede2db36d79ffd6bec32988f0e9c15c45166c0" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.794303 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ppswq" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.920468 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.924743 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 07 07:15:52 crc kubenswrapper[4941]: I0307 07:15:52.926982 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.021777 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.022157 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-log" containerID="cri-o://2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb" gracePeriod=30 Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.022279 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-api" containerID="cri-o://77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4" gracePeriod=30 Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.035487 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.035922 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="04b46cb5-1e5f-49ac-9852-ebb562330737" containerName="nova-scheduler-scheduler" containerID="cri-o://831214eaf89a9fcbe1ed8c8feb08de8789226d181d7192eb68362e3d028092ff" gracePeriod=30 Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.047735 4941 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:53 crc kubenswrapper[4941]: E0307 07:15:53.115519 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="831214eaf89a9fcbe1ed8c8feb08de8789226d181d7192eb68362e3d028092ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:15:53 crc kubenswrapper[4941]: E0307 07:15:53.118622 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="831214eaf89a9fcbe1ed8c8feb08de8789226d181d7192eb68362e3d028092ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:15:53 crc kubenswrapper[4941]: E0307 07:15:53.120915 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="831214eaf89a9fcbe1ed8c8feb08de8789226d181d7192eb68362e3d028092ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:15:53 crc kubenswrapper[4941]: E0307 07:15:53.120956 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="04b46cb5-1e5f-49ac-9852-ebb562330737" containerName="nova-scheduler-scheduler" Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.804233 4941 generic.go:334] "Generic (PLEG): container finished" podID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerID="2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb" exitCode=143 Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.804309 4941 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"b96d8945-3e4c-4bb2-9fbc-383b647cb27d","Type":"ContainerDied","Data":"2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb"} Mar 07 07:15:53 crc kubenswrapper[4941]: I0307 07:15:53.812002 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 07 07:15:54 crc kubenswrapper[4941]: I0307 07:15:54.812474 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-log" containerID="cri-o://039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d" gracePeriod=30 Mar 07 07:15:54 crc kubenswrapper[4941]: I0307 07:15:54.812593 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-metadata" containerID="cri-o://38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3" gracePeriod=30 Mar 07 07:15:55 crc kubenswrapper[4941]: I0307 07:15:55.825463 4941 generic.go:334] "Generic (PLEG): container finished" podID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerID="039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d" exitCode=143 Mar 07 07:15:55 crc kubenswrapper[4941]: I0307 07:15:55.825568 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52e4233a-6075-4eab-8cc4-e1fa1e892931","Type":"ContainerDied","Data":"039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d"} Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.630936 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.727571 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-internal-tls-certs\") pod \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.727659 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-combined-ca-bundle\") pod \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.727739 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5h9p\" (UniqueName: \"kubernetes.io/projected/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-kube-api-access-t5h9p\") pod \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.727812 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-public-tls-certs\") pod \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.727845 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-logs\") pod \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.727925 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-config-data\") pod \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\" (UID: \"b96d8945-3e4c-4bb2-9fbc-383b647cb27d\") " Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.728469 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-logs" (OuterVolumeSpecName: "logs") pod "b96d8945-3e4c-4bb2-9fbc-383b647cb27d" (UID: "b96d8945-3e4c-4bb2-9fbc-383b647cb27d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.733219 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-kube-api-access-t5h9p" (OuterVolumeSpecName: "kube-api-access-t5h9p") pod "b96d8945-3e4c-4bb2-9fbc-383b647cb27d" (UID: "b96d8945-3e4c-4bb2-9fbc-383b647cb27d"). InnerVolumeSpecName "kube-api-access-t5h9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.755528 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b96d8945-3e4c-4bb2-9fbc-383b647cb27d" (UID: "b96d8945-3e4c-4bb2-9fbc-383b647cb27d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.757397 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-config-data" (OuterVolumeSpecName: "config-data") pod "b96d8945-3e4c-4bb2-9fbc-383b647cb27d" (UID: "b96d8945-3e4c-4bb2-9fbc-383b647cb27d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.783506 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b96d8945-3e4c-4bb2-9fbc-383b647cb27d" (UID: "b96d8945-3e4c-4bb2-9fbc-383b647cb27d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.804424 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b96d8945-3e4c-4bb2-9fbc-383b647cb27d" (UID: "b96d8945-3e4c-4bb2-9fbc-383b647cb27d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.829825 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.829862 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5h9p\" (UniqueName: \"kubernetes.io/projected/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-kube-api-access-t5h9p\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.829875 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.829887 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-logs\") on node \"crc\" 
DevicePath \"\"" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.829899 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.829913 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b96d8945-3e4c-4bb2-9fbc-383b647cb27d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.836558 4941 generic.go:334] "Generic (PLEG): container finished" podID="04b46cb5-1e5f-49ac-9852-ebb562330737" containerID="831214eaf89a9fcbe1ed8c8feb08de8789226d181d7192eb68362e3d028092ff" exitCode=0 Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.836628 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b46cb5-1e5f-49ac-9852-ebb562330737","Type":"ContainerDied","Data":"831214eaf89a9fcbe1ed8c8feb08de8789226d181d7192eb68362e3d028092ff"} Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.838730 4941 generic.go:334] "Generic (PLEG): container finished" podID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerID="77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4" exitCode=0 Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.838803 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b96d8945-3e4c-4bb2-9fbc-383b647cb27d","Type":"ContainerDied","Data":"77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4"} Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.838839 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b96d8945-3e4c-4bb2-9fbc-383b647cb27d","Type":"ContainerDied","Data":"f6707638e25925b40ba89ccb7b0497c05c629790e4ef92c5ff04a2d63d7c90d1"} Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 
07:15:56.838857 4941 scope.go:117] "RemoveContainer" containerID="77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.839004 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.909175 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.912713 4941 scope.go:117] "RemoveContainer" containerID="2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.921718 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.935250 4941 scope.go:117] "RemoveContainer" containerID="77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4" Mar 07 07:15:56 crc kubenswrapper[4941]: E0307 07:15:56.935828 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4\": container with ID starting with 77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4 not found: ID does not exist" containerID="77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.935904 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4"} err="failed to get container status \"77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4\": rpc error: code = NotFound desc = could not find container \"77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4\": container with ID starting with 77238f398d9cee130b0d8df3e854f06a7d3561160a1d4734d8cb9fe5c49a2ef4 not found: ID 
does not exist" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.935946 4941 scope.go:117] "RemoveContainer" containerID="2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb" Mar 07 07:15:56 crc kubenswrapper[4941]: E0307 07:15:56.938714 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb\": container with ID starting with 2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb not found: ID does not exist" containerID="2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.938771 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb"} err="failed to get container status \"2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb\": rpc error: code = NotFound desc = could not find container \"2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb\": container with ID starting with 2fc64ffb161a17f8b6fc7ab6d0b9b60be88be7f217fa7c46acc2f088fc9117fb not found: ID does not exist" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.962161 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:56 crc kubenswrapper[4941]: E0307 07:15:56.962899 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-log" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.962920 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-log" Mar 07 07:15:56 crc kubenswrapper[4941]: E0307 07:15:56.962935 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef55d7b-622e-4660-bc2a-990353dae291" containerName="nova-manage" Mar 07 
07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.962943 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef55d7b-622e-4660-bc2a-990353dae291" containerName="nova-manage" Mar 07 07:15:56 crc kubenswrapper[4941]: E0307 07:15:56.962989 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-api" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.962996 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-api" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.963324 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef55d7b-622e-4660-bc2a-990353dae291" containerName="nova-manage" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.963352 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-log" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.963383 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" containerName="nova-api-api" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.971017 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.973743 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.973826 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.975719 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 07 07:15:56 crc kubenswrapper[4941]: I0307 07:15:56.980811 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.044028 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.044125 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-logs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.044222 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.044264 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-config-data\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.044282 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.044310 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6k8n\" (UniqueName: \"kubernetes.io/projected/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-kube-api-access-s6k8n\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.053939 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.145235 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-combined-ca-bundle\") pod \"04b46cb5-1e5f-49ac-9852-ebb562330737\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.145879 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfcfz\" (UniqueName: \"kubernetes.io/projected/04b46cb5-1e5f-49ac-9852-ebb562330737-kube-api-access-tfcfz\") pod \"04b46cb5-1e5f-49ac-9852-ebb562330737\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.145931 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-config-data\") pod \"04b46cb5-1e5f-49ac-9852-ebb562330737\" (UID: \"04b46cb5-1e5f-49ac-9852-ebb562330737\") " Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.146258 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.146324 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-config-data\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.146357 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.146383 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6k8n\" (UniqueName: \"kubernetes.io/projected/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-kube-api-access-s6k8n\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.146492 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.146540 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-logs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.147168 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-logs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.150434 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b46cb5-1e5f-49ac-9852-ebb562330737-kube-api-access-tfcfz" (OuterVolumeSpecName: "kube-api-access-tfcfz") pod "04b46cb5-1e5f-49ac-9852-ebb562330737" (UID: "04b46cb5-1e5f-49ac-9852-ebb562330737"). InnerVolumeSpecName "kube-api-access-tfcfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.150728 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.153220 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.153266 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-config-data\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.155005 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.163575 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6k8n\" (UniqueName: \"kubernetes.io/projected/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-kube-api-access-s6k8n\") pod \"nova-api-0\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.174435 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-config-data" (OuterVolumeSpecName: "config-data") pod "04b46cb5-1e5f-49ac-9852-ebb562330737" (UID: "04b46cb5-1e5f-49ac-9852-ebb562330737"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.184539 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04b46cb5-1e5f-49ac-9852-ebb562330737" (UID: "04b46cb5-1e5f-49ac-9852-ebb562330737"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.248171 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfcfz\" (UniqueName: \"kubernetes.io/projected/04b46cb5-1e5f-49ac-9852-ebb562330737-kube-api-access-tfcfz\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.248205 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.248217 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b46cb5-1e5f-49ac-9852-ebb562330737-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.292225 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.727294 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:15:57 crc kubenswrapper[4941]: W0307 07:15:57.739100 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0b4aa9b_9dc9_4e99_87e8_6320c5a81456.slice/crio-7d06f9c8e232fe9e1561f48bec2c8f98d10b80ac8b6c45382a1fe11a80ade1c0 WatchSource:0}: Error finding container 7d06f9c8e232fe9e1561f48bec2c8f98d10b80ac8b6c45382a1fe11a80ade1c0: Status 404 returned error can't find the container with id 7d06f9c8e232fe9e1561f48bec2c8f98d10b80ac8b6c45382a1fe11a80ade1c0 Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.850079 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b46cb5-1e5f-49ac-9852-ebb562330737","Type":"ContainerDied","Data":"018406604e00a3ab34b3d05cf12f923b337ec47ec05a92229861873150941da8"} Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.850115 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.850435 4941 scope.go:117] "RemoveContainer" containerID="831214eaf89a9fcbe1ed8c8feb08de8789226d181d7192eb68362e3d028092ff" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.852149 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456","Type":"ContainerStarted","Data":"7d06f9c8e232fe9e1561f48bec2c8f98d10b80ac8b6c45382a1fe11a80ade1c0"} Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.909707 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.916105 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": dial tcp 10.217.0.202:8775: connect: connection refused" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.916509 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": dial tcp 10.217.0.202:8775: connect: connection refused" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.926682 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.941969 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:15:57 crc kubenswrapper[4941]: E0307 07:15:57.942395 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b46cb5-1e5f-49ac-9852-ebb562330737" containerName="nova-scheduler-scheduler" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.942424 4941 
state_mem.go:107] "Deleted CPUSet assignment" podUID="04b46cb5-1e5f-49ac-9852-ebb562330737" containerName="nova-scheduler-scheduler" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.942610 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b46cb5-1e5f-49ac-9852-ebb562330737" containerName="nova-scheduler-scheduler" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.943253 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.945257 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.971323 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b46cb5-1e5f-49ac-9852-ebb562330737" path="/var/lib/kubelet/pods/04b46cb5-1e5f-49ac-9852-ebb562330737/volumes" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.972231 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b96d8945-3e4c-4bb2-9fbc-383b647cb27d" path="/var/lib/kubelet/pods/b96d8945-3e4c-4bb2-9fbc-383b647cb27d/volumes" Mar 07 07:15:57 crc kubenswrapper[4941]: I0307 07:15:57.973027 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.065674 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.065997 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmdm\" (UniqueName: 
\"kubernetes.io/projected/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-kube-api-access-hwmdm\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.066178 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-config-data\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.167847 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-config-data\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.168282 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.168350 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmdm\" (UniqueName: \"kubernetes.io/projected/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-kube-api-access-hwmdm\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.172373 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-config-data\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " 
pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.172752 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.184706 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmdm\" (UniqueName: \"kubernetes.io/projected/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-kube-api-access-hwmdm\") pod \"nova-scheduler-0\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.239773 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.292578 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.378240 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-combined-ca-bundle\") pod \"52e4233a-6075-4eab-8cc4-e1fa1e892931\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.378332 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-nova-metadata-tls-certs\") pod \"52e4233a-6075-4eab-8cc4-e1fa1e892931\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.382378 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-config-data\") pod \"52e4233a-6075-4eab-8cc4-e1fa1e892931\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.382580 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx8gh\" (UniqueName: \"kubernetes.io/projected/52e4233a-6075-4eab-8cc4-e1fa1e892931-kube-api-access-cx8gh\") pod \"52e4233a-6075-4eab-8cc4-e1fa1e892931\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.382652 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e4233a-6075-4eab-8cc4-e1fa1e892931-logs\") pod \"52e4233a-6075-4eab-8cc4-e1fa1e892931\" (UID: \"52e4233a-6075-4eab-8cc4-e1fa1e892931\") " Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.384051 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/52e4233a-6075-4eab-8cc4-e1fa1e892931-logs" (OuterVolumeSpecName: "logs") pod "52e4233a-6075-4eab-8cc4-e1fa1e892931" (UID: "52e4233a-6075-4eab-8cc4-e1fa1e892931"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.417602 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e4233a-6075-4eab-8cc4-e1fa1e892931-kube-api-access-cx8gh" (OuterVolumeSpecName: "kube-api-access-cx8gh") pod "52e4233a-6075-4eab-8cc4-e1fa1e892931" (UID: "52e4233a-6075-4eab-8cc4-e1fa1e892931"). InnerVolumeSpecName "kube-api-access-cx8gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.424306 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52e4233a-6075-4eab-8cc4-e1fa1e892931" (UID: "52e4233a-6075-4eab-8cc4-e1fa1e892931"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.434097 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-config-data" (OuterVolumeSpecName: "config-data") pod "52e4233a-6075-4eab-8cc4-e1fa1e892931" (UID: "52e4233a-6075-4eab-8cc4-e1fa1e892931"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.455717 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "52e4233a-6075-4eab-8cc4-e1fa1e892931" (UID: "52e4233a-6075-4eab-8cc4-e1fa1e892931"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.484814 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.484847 4941 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.484857 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e4233a-6075-4eab-8cc4-e1fa1e892931-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.484865 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx8gh\" (UniqueName: \"kubernetes.io/projected/52e4233a-6075-4eab-8cc4-e1fa1e892931-kube-api-access-cx8gh\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.484873 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e4233a-6075-4eab-8cc4-e1fa1e892931-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.786951 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.866225 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456","Type":"ContainerStarted","Data":"8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85"} Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.866269 4941 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456","Type":"ContainerStarted","Data":"e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f"} Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.867816 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05a91fa3-14f1-4d15-bdfc-bb1fc310a913","Type":"ContainerStarted","Data":"1fcf3ac7428cb9807e710cd2d12bcc87bfcc402eadda52d1e20b5646b848c350"} Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.871225 4941 generic.go:334] "Generic (PLEG): container finished" podID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerID="38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3" exitCode=0 Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.871266 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52e4233a-6075-4eab-8cc4-e1fa1e892931","Type":"ContainerDied","Data":"38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3"} Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.871293 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52e4233a-6075-4eab-8cc4-e1fa1e892931","Type":"ContainerDied","Data":"46085642a25947b7edcc993722705214c8a8d1e89ce615a82fe352c445f5a486"} Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.871311 4941 scope.go:117] "RemoveContainer" containerID="38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.871316 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.891716 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8916950850000003 podStartE2EDuration="2.891695085s" podCreationTimestamp="2026-03-07 07:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:58.887376396 +0000 UTC m=+1455.839741871" watchObservedRunningTime="2026-03-07 07:15:58.891695085 +0000 UTC m=+1455.844060550" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.897787 4941 scope.go:117] "RemoveContainer" containerID="039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.925136 4941 scope.go:117] "RemoveContainer" containerID="38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3" Mar 07 07:15:58 crc kubenswrapper[4941]: E0307 07:15:58.925513 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3\": container with ID starting with 38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3 not found: ID does not exist" containerID="38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.925538 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3"} err="failed to get container status \"38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3\": rpc error: code = NotFound desc = could not find container \"38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3\": container with ID starting with 
38e8e4193196c14a7e10d291fea5edff967e237026d299d5f89cf712cfe271b3 not found: ID does not exist" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.925555 4941 scope.go:117] "RemoveContainer" containerID="039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d" Mar 07 07:15:58 crc kubenswrapper[4941]: E0307 07:15:58.927118 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d\": container with ID starting with 039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d not found: ID does not exist" containerID="039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.927142 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d"} err="failed to get container status \"039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d\": rpc error: code = NotFound desc = could not find container \"039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d\": container with ID starting with 039e99f324008f83778a7c05ed45ce8a039837cc1194b4d5636e92f100f35b8d not found: ID does not exist" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.935135 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.952150 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.963691 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:58 crc kubenswrapper[4941]: E0307 07:15:58.964127 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-log" Mar 07 
07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.964144 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-log" Mar 07 07:15:58 crc kubenswrapper[4941]: E0307 07:15:58.964233 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-metadata" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.964243 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-metadata" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.964506 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-log" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.964531 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" containerName="nova-metadata-metadata" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.966018 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.967788 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.968866 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 07 07:15:58 crc kubenswrapper[4941]: I0307 07:15:58.997740 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.095604 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.095686 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.095733 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-config-data\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.095756 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzkch\" (UniqueName: 
\"kubernetes.io/projected/c892cbf7-126c-4638-854d-18cef63c7747-kube-api-access-wzkch\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.095887 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892cbf7-126c-4638-854d-18cef63c7747-logs\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.197334 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892cbf7-126c-4638-854d-18cef63c7747-logs\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.197425 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.197487 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.197530 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-config-data\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" 
Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.197554 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzkch\" (UniqueName: \"kubernetes.io/projected/c892cbf7-126c-4638-854d-18cef63c7747-kube-api-access-wzkch\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.198785 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892cbf7-126c-4638-854d-18cef63c7747-logs\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.201391 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.201547 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.212678 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-config-data\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.216867 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzkch\" (UniqueName: 
\"kubernetes.io/projected/c892cbf7-126c-4638-854d-18cef63c7747-kube-api-access-wzkch\") pod \"nova-metadata-0\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.296528 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.791468 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.884161 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05a91fa3-14f1-4d15-bdfc-bb1fc310a913","Type":"ContainerStarted","Data":"ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd"} Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.885245 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c892cbf7-126c-4638-854d-18cef63c7747","Type":"ContainerStarted","Data":"2399780da12c932963c0a5020fc82674c1fb29a3cc71a2e1aafaa5fa7f5f1133"} Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.901992 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.90197671 podStartE2EDuration="2.90197671s" podCreationTimestamp="2026-03-07 07:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:59.899582249 +0000 UTC m=+1456.851947724" watchObservedRunningTime="2026-03-07 07:15:59.90197671 +0000 UTC m=+1456.854342175" Mar 07 07:15:59 crc kubenswrapper[4941]: I0307 07:15:59.973104 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e4233a-6075-4eab-8cc4-e1fa1e892931" path="/var/lib/kubelet/pods/52e4233a-6075-4eab-8cc4-e1fa1e892931/volumes" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.146703 4941 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547796-5ncn7"] Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.148763 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547796-5ncn7" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.151290 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.151585 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.151971 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.156289 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547796-5ncn7"] Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.214158 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rzz2\" (UniqueName: \"kubernetes.io/projected/5923db9b-eb36-4b13-a0e5-78ba9c2017b1-kube-api-access-8rzz2\") pod \"auto-csr-approver-29547796-5ncn7\" (UID: \"5923db9b-eb36-4b13-a0e5-78ba9c2017b1\") " pod="openshift-infra/auto-csr-approver-29547796-5ncn7" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.315885 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rzz2\" (UniqueName: \"kubernetes.io/projected/5923db9b-eb36-4b13-a0e5-78ba9c2017b1-kube-api-access-8rzz2\") pod \"auto-csr-approver-29547796-5ncn7\" (UID: \"5923db9b-eb36-4b13-a0e5-78ba9c2017b1\") " pod="openshift-infra/auto-csr-approver-29547796-5ncn7" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.345258 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8rzz2\" (UniqueName: \"kubernetes.io/projected/5923db9b-eb36-4b13-a0e5-78ba9c2017b1-kube-api-access-8rzz2\") pod \"auto-csr-approver-29547796-5ncn7\" (UID: \"5923db9b-eb36-4b13-a0e5-78ba9c2017b1\") " pod="openshift-infra/auto-csr-approver-29547796-5ncn7" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.467665 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547796-5ncn7" Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.898573 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c892cbf7-126c-4638-854d-18cef63c7747","Type":"ContainerStarted","Data":"dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a"} Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.898916 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c892cbf7-126c-4638-854d-18cef63c7747","Type":"ContainerStarted","Data":"bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d"} Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.903695 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547796-5ncn7"] Mar 07 07:16:00 crc kubenswrapper[4941]: I0307 07:16:00.934944 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.934922416 podStartE2EDuration="2.934922416s" podCreationTimestamp="2026-03-07 07:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:16:00.922370929 +0000 UTC m=+1457.874736404" watchObservedRunningTime="2026-03-07 07:16:00.934922416 +0000 UTC m=+1457.887287881" Mar 07 07:16:01 crc kubenswrapper[4941]: I0307 07:16:01.914674 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547796-5ncn7" 
event={"ID":"5923db9b-eb36-4b13-a0e5-78ba9c2017b1","Type":"ContainerStarted","Data":"735e3bc0a632aa6cc6a53bf7a936981f1b74c8d053b601f2a11991e46b5a133b"} Mar 07 07:16:02 crc kubenswrapper[4941]: I0307 07:16:02.933320 4941 generic.go:334] "Generic (PLEG): container finished" podID="5923db9b-eb36-4b13-a0e5-78ba9c2017b1" containerID="9ba176036dee7f5c9afc1e0e8e5a89d8cb4fcf2342760faaa6466e404e2b45ac" exitCode=0 Mar 07 07:16:02 crc kubenswrapper[4941]: I0307 07:16:02.933445 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547796-5ncn7" event={"ID":"5923db9b-eb36-4b13-a0e5-78ba9c2017b1","Type":"ContainerDied","Data":"9ba176036dee7f5c9afc1e0e8e5a89d8cb4fcf2342760faaa6466e404e2b45ac"} Mar 07 07:16:03 crc kubenswrapper[4941]: I0307 07:16:03.293772 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.263470 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547796-5ncn7" Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.297014 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.297080 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.307250 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rzz2\" (UniqueName: \"kubernetes.io/projected/5923db9b-eb36-4b13-a0e5-78ba9c2017b1-kube-api-access-8rzz2\") pod \"5923db9b-eb36-4b13-a0e5-78ba9c2017b1\" (UID: \"5923db9b-eb36-4b13-a0e5-78ba9c2017b1\") " Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.314057 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5923db9b-eb36-4b13-a0e5-78ba9c2017b1-kube-api-access-8rzz2" (OuterVolumeSpecName: "kube-api-access-8rzz2") pod "5923db9b-eb36-4b13-a0e5-78ba9c2017b1" (UID: "5923db9b-eb36-4b13-a0e5-78ba9c2017b1"). InnerVolumeSpecName "kube-api-access-8rzz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.410170 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rzz2\" (UniqueName: \"kubernetes.io/projected/5923db9b-eb36-4b13-a0e5-78ba9c2017b1-kube-api-access-8rzz2\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.954428 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547796-5ncn7" Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.954422 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547796-5ncn7" event={"ID":"5923db9b-eb36-4b13-a0e5-78ba9c2017b1","Type":"ContainerDied","Data":"735e3bc0a632aa6cc6a53bf7a936981f1b74c8d053b601f2a11991e46b5a133b"} Mar 07 07:16:04 crc kubenswrapper[4941]: I0307 07:16:04.954870 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="735e3bc0a632aa6cc6a53bf7a936981f1b74c8d053b601f2a11991e46b5a133b" Mar 07 07:16:05 crc kubenswrapper[4941]: I0307 07:16:05.340323 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547790-5jgkv"] Mar 07 07:16:05 crc kubenswrapper[4941]: I0307 07:16:05.351678 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547790-5jgkv"] Mar 07 07:16:05 crc kubenswrapper[4941]: I0307 07:16:05.965213 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd90b16-b290-4cce-bf50-05e5c4f30a48" path="/var/lib/kubelet/pods/5bd90b16-b290-4cce-bf50-05e5c4f30a48/volumes" Mar 07 07:16:07 crc kubenswrapper[4941]: I0307 07:16:07.293635 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 07:16:07 crc kubenswrapper[4941]: I0307 07:16:07.294014 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 07:16:08 crc kubenswrapper[4941]: I0307 07:16:08.292994 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 07 07:16:08 crc kubenswrapper[4941]: I0307 07:16:08.300636 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 07:16:08 crc kubenswrapper[4941]: I0307 07:16:08.307611 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 07:16:08 crc kubenswrapper[4941]: I0307 07:16:08.327485 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 07 07:16:08 crc kubenswrapper[4941]: I0307 07:16:08.967308 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 07 07:16:09 crc kubenswrapper[4941]: I0307 07:16:09.070073 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 07 07:16:09 crc kubenswrapper[4941]: I0307 07:16:09.299105 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 07 07:16:09 crc kubenswrapper[4941]: I0307 07:16:09.299218 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 07 07:16:10 crc kubenswrapper[4941]: I0307 07:16:10.302570 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 07:16:10 crc kubenswrapper[4941]: I0307 07:16:10.306580 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 07:16:12 crc kubenswrapper[4941]: I0307 07:16:12.420325 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:16:12 crc kubenswrapper[4941]: I0307 07:16:12.421107 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="7b8305a8-370d-4b70-8807-e0188603429f" containerName="kube-state-metrics" containerID="cri-o://d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367" gracePeriod=30 Mar 07 07:16:12 crc kubenswrapper[4941]: I0307 07:16:12.864096 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 07 07:16:12 crc kubenswrapper[4941]: I0307 07:16:12.998739 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt88n\" (UniqueName: \"kubernetes.io/projected/7b8305a8-370d-4b70-8807-e0188603429f-kube-api-access-rt88n\") pod \"7b8305a8-370d-4b70-8807-e0188603429f\" (UID: \"7b8305a8-370d-4b70-8807-e0188603429f\") " Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.005665 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8305a8-370d-4b70-8807-e0188603429f-kube-api-access-rt88n" (OuterVolumeSpecName: "kube-api-access-rt88n") pod "7b8305a8-370d-4b70-8807-e0188603429f" (UID: "7b8305a8-370d-4b70-8807-e0188603429f"). InnerVolumeSpecName "kube-api-access-rt88n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.031781 4941 generic.go:334] "Generic (PLEG): container finished" podID="7b8305a8-370d-4b70-8807-e0188603429f" containerID="d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367" exitCode=2 Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.031831 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7b8305a8-370d-4b70-8807-e0188603429f","Type":"ContainerDied","Data":"d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367"} Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.031845 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.031860 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7b8305a8-370d-4b70-8807-e0188603429f","Type":"ContainerDied","Data":"c6a9c58d2981c0f4c12aff61b0a5c960da7136612c3a3ff3f65574c179cd7352"} Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.031881 4941 scope.go:117] "RemoveContainer" containerID="d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.090265 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.100854 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt88n\" (UniqueName: \"kubernetes.io/projected/7b8305a8-370d-4b70-8807-e0188603429f-kube-api-access-rt88n\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.102825 4941 scope.go:117] "RemoveContainer" containerID="d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367" Mar 07 07:16:13 crc kubenswrapper[4941]: E0307 07:16:13.105619 4941 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367\": container with ID starting with d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367 not found: ID does not exist" containerID="d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.105690 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367"} err="failed to get container status \"d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367\": rpc error: code = NotFound desc = could not find container \"d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367\": container with ID starting with d9198001c28a9e1be27e0ffe75994ebbba017f278186889f54463acfe87c9367 not found: ID does not exist" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.112688 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.123978 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:16:13 crc kubenswrapper[4941]: E0307 07:16:13.124554 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5923db9b-eb36-4b13-a0e5-78ba9c2017b1" containerName="oc" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.124575 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="5923db9b-eb36-4b13-a0e5-78ba9c2017b1" containerName="oc" Mar 07 07:16:13 crc kubenswrapper[4941]: E0307 07:16:13.124596 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8305a8-370d-4b70-8807-e0188603429f" containerName="kube-state-metrics" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.124602 4941 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7b8305a8-370d-4b70-8807-e0188603429f" containerName="kube-state-metrics" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.124785 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="5923db9b-eb36-4b13-a0e5-78ba9c2017b1" containerName="oc" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.124800 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8305a8-370d-4b70-8807-e0188603429f" containerName="kube-state-metrics" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.125456 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.127391 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.127852 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.137552 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.202001 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkg5\" (UniqueName: \"kubernetes.io/projected/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-api-access-wdkg5\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.202075 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: 
I0307 07:16:13.202097 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.202282 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.304026 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkg5\" (UniqueName: \"kubernetes.io/projected/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-api-access-wdkg5\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.304091 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.304108 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: 
I0307 07:16:13.304170 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.310533 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.311144 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.316334 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.329082 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkg5\" (UniqueName: \"kubernetes.io/projected/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-api-access-wdkg5\") pod \"kube-state-metrics-0\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.444186 4941 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.915677 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:16:13 crc kubenswrapper[4941]: I0307 07:16:13.970704 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8305a8-370d-4b70-8807-e0188603429f" path="/var/lib/kubelet/pods/7b8305a8-370d-4b70-8807-e0188603429f/volumes" Mar 07 07:16:14 crc kubenswrapper[4941]: I0307 07:16:14.042034 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"74c4b049-d672-41e8-b3cb-09b800f04a19","Type":"ContainerStarted","Data":"30521c90e4212432949f623b8c8a6e0634a9ca40ea2fa873605d323c704b886f"} Mar 07 07:16:14 crc kubenswrapper[4941]: I0307 07:16:14.383013 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:16:14 crc kubenswrapper[4941]: I0307 07:16:14.383326 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="ceilometer-central-agent" containerID="cri-o://361d398e8fc72c653035917cc9b68c1757d8c3c0e6a0df9e4f2f5f42e6a187b4" gracePeriod=30 Mar 07 07:16:14 crc kubenswrapper[4941]: I0307 07:16:14.383365 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="sg-core" containerID="cri-o://e06f7599440d38a4a93fa7deb481a34fe8928621b5a45e53d11879ac203ec92c" gracePeriod=30 Mar 07 07:16:14 crc kubenswrapper[4941]: I0307 07:16:14.383489 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="proxy-httpd" containerID="cri-o://b4f512e0aa7ca8a4ed5bf77ec13dfb0ab712fd25b64fb2f6461583106c2b0186" gracePeriod=30 Mar 07 07:16:14 crc 
kubenswrapper[4941]: I0307 07:16:14.383464 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="ceilometer-notification-agent" containerID="cri-o://7a36b56ccfdf2599202a3e46368a0bcacd3caccbe64a1360dfc5993ca0555bbd" gracePeriod=30 Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.056308 4941 generic.go:334] "Generic (PLEG): container finished" podID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerID="b4f512e0aa7ca8a4ed5bf77ec13dfb0ab712fd25b64fb2f6461583106c2b0186" exitCode=0 Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.056680 4941 generic.go:334] "Generic (PLEG): container finished" podID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerID="e06f7599440d38a4a93fa7deb481a34fe8928621b5a45e53d11879ac203ec92c" exitCode=2 Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.056694 4941 generic.go:334] "Generic (PLEG): container finished" podID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerID="361d398e8fc72c653035917cc9b68c1757d8c3c0e6a0df9e4f2f5f42e6a187b4" exitCode=0 Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.056337 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerDied","Data":"b4f512e0aa7ca8a4ed5bf77ec13dfb0ab712fd25b64fb2f6461583106c2b0186"} Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.056758 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerDied","Data":"e06f7599440d38a4a93fa7deb481a34fe8928621b5a45e53d11879ac203ec92c"} Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.056773 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerDied","Data":"361d398e8fc72c653035917cc9b68c1757d8c3c0e6a0df9e4f2f5f42e6a187b4"} 
Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.058264 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"74c4b049-d672-41e8-b3cb-09b800f04a19","Type":"ContainerStarted","Data":"d0ced7486dfed1220f94f3b911f9652e0c4769872e3285a1108e1beb5bef597b"} Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.058410 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 07 07:16:15 crc kubenswrapper[4941]: I0307 07:16:15.077335 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.5917186079999999 podStartE2EDuration="2.077310446s" podCreationTimestamp="2026-03-07 07:16:13 +0000 UTC" firstStartedPulling="2026-03-07 07:16:13.920607586 +0000 UTC m=+1470.872973061" lastFinishedPulling="2026-03-07 07:16:14.406199434 +0000 UTC m=+1471.358564899" observedRunningTime="2026-03-07 07:16:15.073878299 +0000 UTC m=+1472.026243764" watchObservedRunningTime="2026-03-07 07:16:15.077310446 +0000 UTC m=+1472.029675941" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.080301 4941 generic.go:334] "Generic (PLEG): container finished" podID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerID="7a36b56ccfdf2599202a3e46368a0bcacd3caccbe64a1360dfc5993ca0555bbd" exitCode=0 Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.080369 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerDied","Data":"7a36b56ccfdf2599202a3e46368a0bcacd3caccbe64a1360dfc5993ca0555bbd"} Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.299206 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.300516 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 07 07:16:17 crc 
kubenswrapper[4941]: I0307 07:16:17.300941 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.305443 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.322832 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.386817 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-log-httpd\") pod \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.386949 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-combined-ca-bundle\") pod \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.386988 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-sg-core-conf-yaml\") pod \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.387026 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lzkl\" (UniqueName: \"kubernetes.io/projected/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-kube-api-access-9lzkl\") pod \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.387041 4941 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-run-httpd\") pod \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.387121 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-config-data\") pod \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.387191 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-scripts\") pod \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\" (UID: \"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3\") " Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.388622 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" (UID: "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.390692 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" (UID: "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.396703 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-scripts" (OuterVolumeSpecName: "scripts") pod "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" (UID: "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.396935 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-kube-api-access-9lzkl" (OuterVolumeSpecName: "kube-api-access-9lzkl") pod "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" (UID: "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3"). InnerVolumeSpecName "kube-api-access-9lzkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.429629 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" (UID: "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.479837 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" (UID: "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.485624 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-config-data" (OuterVolumeSpecName: "config-data") pod "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" (UID: "e3ea6f6c-45a7-4071-8783-25a2fc52e5e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.493368 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.493418 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.493431 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lzkl\" (UniqueName: \"kubernetes.io/projected/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-kube-api-access-9lzkl\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.493444 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.493456 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.493466 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:17 crc kubenswrapper[4941]: I0307 07:16:17.493476 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.103898 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3ea6f6c-45a7-4071-8783-25a2fc52e5e3","Type":"ContainerDied","Data":"33a90ae920faa6a7a10efa1f0537e285f3d0ca62528cdb38070277f156364966"} Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.103970 4941 scope.go:117] "RemoveContainer" containerID="b4f512e0aa7ca8a4ed5bf77ec13dfb0ab712fd25b64fb2f6461583106c2b0186" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.103922 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.104112 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.115651 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.133952 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.141672 4941 scope.go:117] "RemoveContainer" containerID="e06f7599440d38a4a93fa7deb481a34fe8928621b5a45e53d11879ac203ec92c" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.167447 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.181565 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:16:18 crc kubenswrapper[4941]: 
E0307 07:16:18.182018 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="proxy-httpd" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.182040 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="proxy-httpd" Mar 07 07:16:18 crc kubenswrapper[4941]: E0307 07:16:18.182070 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="sg-core" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.182079 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="sg-core" Mar 07 07:16:18 crc kubenswrapper[4941]: E0307 07:16:18.182095 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="ceilometer-central-agent" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.182103 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="ceilometer-central-agent" Mar 07 07:16:18 crc kubenswrapper[4941]: E0307 07:16:18.182123 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="ceilometer-notification-agent" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.182132 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="ceilometer-notification-agent" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.182371 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="ceilometer-notification-agent" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.182394 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="sg-core" Mar 07 07:16:18 
crc kubenswrapper[4941]: I0307 07:16:18.182432 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="ceilometer-central-agent" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.182460 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" containerName="proxy-httpd" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.187160 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.196443 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.197270 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.197485 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.197705 4941 scope.go:117] "RemoveContainer" containerID="7a36b56ccfdf2599202a3e46368a0bcacd3caccbe64a1360dfc5993ca0555bbd" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.201307 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.232594 4941 scope.go:117] "RemoveContainer" containerID="361d398e8fc72c653035917cc9b68c1757d8c3c0e6a0df9e4f2f5f42e6a187b4" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.314830 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-run-httpd\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 
07:16:18.314897 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.314958 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-config-data\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.315012 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.315038 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-scripts\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.315121 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnh29\" (UniqueName: \"kubernetes.io/projected/fc5e0ad9-b4e5-4307-a381-3a92092a3240-kube-api-access-mnh29\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.315157 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-log-httpd\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.315191 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.416774 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-scripts\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.416878 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnh29\" (UniqueName: \"kubernetes.io/projected/fc5e0ad9-b4e5-4307-a381-3a92092a3240-kube-api-access-mnh29\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.416908 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-log-httpd\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.416932 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " 
pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.416962 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-run-httpd\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.416993 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.417028 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-config-data\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.417064 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.417554 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-run-httpd\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.417886 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-log-httpd\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.423015 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.423106 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-config-data\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.423020 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-scripts\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.423860 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.425514 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.434364 4941 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnh29\" (UniqueName: \"kubernetes.io/projected/fc5e0ad9-b4e5-4307-a381-3a92092a3240-kube-api-access-mnh29\") pod \"ceilometer-0\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " pod="openstack/ceilometer-0" Mar 07 07:16:18 crc kubenswrapper[4941]: I0307 07:16:18.512753 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:16:19 crc kubenswrapper[4941]: I0307 07:16:19.023688 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:16:19 crc kubenswrapper[4941]: W0307 07:16:19.026292 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc5e0ad9_b4e5_4307_a381_3a92092a3240.slice/crio-48df85e9c8bbe3dacf4a6e9fa2e3e50848014284023c8e98a3b7426f16cf4280 WatchSource:0}: Error finding container 48df85e9c8bbe3dacf4a6e9fa2e3e50848014284023c8e98a3b7426f16cf4280: Status 404 returned error can't find the container with id 48df85e9c8bbe3dacf4a6e9fa2e3e50848014284023c8e98a3b7426f16cf4280 Mar 07 07:16:19 crc kubenswrapper[4941]: I0307 07:16:19.119525 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerStarted","Data":"48df85e9c8bbe3dacf4a6e9fa2e3e50848014284023c8e98a3b7426f16cf4280"} Mar 07 07:16:19 crc kubenswrapper[4941]: I0307 07:16:19.302317 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 07 07:16:19 crc kubenswrapper[4941]: I0307 07:16:19.304572 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 07 07:16:19 crc kubenswrapper[4941]: I0307 07:16:19.310237 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 07 07:16:19 crc kubenswrapper[4941]: 
I0307 07:16:19.965125 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ea6f6c-45a7-4071-8783-25a2fc52e5e3" path="/var/lib/kubelet/pods/e3ea6f6c-45a7-4071-8783-25a2fc52e5e3/volumes" Mar 07 07:16:20 crc kubenswrapper[4941]: I0307 07:16:20.134600 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerStarted","Data":"6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494"} Mar 07 07:16:20 crc kubenswrapper[4941]: I0307 07:16:20.141795 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 07 07:16:21 crc kubenswrapper[4941]: I0307 07:16:21.147245 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerStarted","Data":"24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a"} Mar 07 07:16:22 crc kubenswrapper[4941]: I0307 07:16:22.157097 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerStarted","Data":"340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f"} Mar 07 07:16:23 crc kubenswrapper[4941]: I0307 07:16:23.169398 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerStarted","Data":"7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d"} Mar 07 07:16:23 crc kubenswrapper[4941]: I0307 07:16:23.170038 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 07 07:16:23 crc kubenswrapper[4941]: I0307 07:16:23.197938 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.446403262 podStartE2EDuration="5.197918199s" podCreationTimestamp="2026-03-07 
07:16:18 +0000 UTC" firstStartedPulling="2026-03-07 07:16:19.030111673 +0000 UTC m=+1475.982477168" lastFinishedPulling="2026-03-07 07:16:22.78162664 +0000 UTC m=+1479.733992105" observedRunningTime="2026-03-07 07:16:23.188265515 +0000 UTC m=+1480.140630980" watchObservedRunningTime="2026-03-07 07:16:23.197918199 +0000 UTC m=+1480.150283664" Mar 07 07:16:23 crc kubenswrapper[4941]: I0307 07:16:23.452421 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.362045 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-92d94"] Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.368498 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.374692 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92d94"] Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.413062 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-utilities\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.413146 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-catalog-content\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.413232 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rztl8\" (UniqueName: \"kubernetes.io/projected/bb374283-35b0-40ec-8d09-61a5f49503d3-kube-api-access-rztl8\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.514854 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rztl8\" (UniqueName: \"kubernetes.io/projected/bb374283-35b0-40ec-8d09-61a5f49503d3-kube-api-access-rztl8\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.515017 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-utilities\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.515086 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-catalog-content\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.515597 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-utilities\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.515710 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-catalog-content\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.534323 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rztl8\" (UniqueName: \"kubernetes.io/projected/bb374283-35b0-40ec-8d09-61a5f49503d3-kube-api-access-rztl8\") pod \"redhat-operators-92d94\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:38 crc kubenswrapper[4941]: I0307 07:16:38.705334 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:39 crc kubenswrapper[4941]: I0307 07:16:39.667754 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92d94"] Mar 07 07:16:40 crc kubenswrapper[4941]: I0307 07:16:40.367239 4941 generic.go:334] "Generic (PLEG): container finished" podID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerID="02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753" exitCode=0 Mar 07 07:16:40 crc kubenswrapper[4941]: I0307 07:16:40.367330 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92d94" event={"ID":"bb374283-35b0-40ec-8d09-61a5f49503d3","Type":"ContainerDied","Data":"02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753"} Mar 07 07:16:40 crc kubenswrapper[4941]: I0307 07:16:40.367574 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92d94" event={"ID":"bb374283-35b0-40ec-8d09-61a5f49503d3","Type":"ContainerStarted","Data":"f85ea8586bcbda3ed36e38ed67d4fe464b6791da8be1e308dd01aa802c662adf"} Mar 07 07:16:41 crc kubenswrapper[4941]: I0307 07:16:41.383360 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-92d94" event={"ID":"bb374283-35b0-40ec-8d09-61a5f49503d3","Type":"ContainerStarted","Data":"e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24"} Mar 07 07:16:42 crc kubenswrapper[4941]: I0307 07:16:42.395067 4941 generic.go:334] "Generic (PLEG): container finished" podID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerID="e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24" exitCode=0 Mar 07 07:16:42 crc kubenswrapper[4941]: I0307 07:16:42.395120 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92d94" event={"ID":"bb374283-35b0-40ec-8d09-61a5f49503d3","Type":"ContainerDied","Data":"e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24"} Mar 07 07:16:43 crc kubenswrapper[4941]: I0307 07:16:43.407296 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92d94" event={"ID":"bb374283-35b0-40ec-8d09-61a5f49503d3","Type":"ContainerStarted","Data":"33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee"} Mar 07 07:16:43 crc kubenswrapper[4941]: I0307 07:16:43.428572 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-92d94" podStartSLOduration=2.889491507 podStartE2EDuration="5.428555999s" podCreationTimestamp="2026-03-07 07:16:38 +0000 UTC" firstStartedPulling="2026-03-07 07:16:40.368787629 +0000 UTC m=+1497.321153094" lastFinishedPulling="2026-03-07 07:16:42.907852081 +0000 UTC m=+1499.860217586" observedRunningTime="2026-03-07 07:16:43.423751027 +0000 UTC m=+1500.376116522" watchObservedRunningTime="2026-03-07 07:16:43.428555999 +0000 UTC m=+1500.380921464" Mar 07 07:16:48 crc kubenswrapper[4941]: I0307 07:16:48.522572 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 07 07:16:48 crc kubenswrapper[4941]: I0307 07:16:48.705894 4941 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:48 crc kubenswrapper[4941]: I0307 07:16:48.705940 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:49 crc kubenswrapper[4941]: I0307 07:16:49.751232 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-92d94" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="registry-server" probeResult="failure" output=< Mar 07 07:16:49 crc kubenswrapper[4941]: timeout: failed to connect service ":50051" within 1s Mar 07 07:16:49 crc kubenswrapper[4941]: > Mar 07 07:16:58 crc kubenswrapper[4941]: I0307 07:16:58.780313 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:58 crc kubenswrapper[4941]: I0307 07:16:58.857632 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:16:59 crc kubenswrapper[4941]: I0307 07:16:59.034235 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92d94"] Mar 07 07:17:00 crc kubenswrapper[4941]: I0307 07:17:00.596790 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-92d94" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="registry-server" containerID="cri-o://33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee" gracePeriod=2 Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.021634 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.176961 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rztl8\" (UniqueName: \"kubernetes.io/projected/bb374283-35b0-40ec-8d09-61a5f49503d3-kube-api-access-rztl8\") pod \"bb374283-35b0-40ec-8d09-61a5f49503d3\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.177142 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-catalog-content\") pod \"bb374283-35b0-40ec-8d09-61a5f49503d3\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.177257 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-utilities\") pod \"bb374283-35b0-40ec-8d09-61a5f49503d3\" (UID: \"bb374283-35b0-40ec-8d09-61a5f49503d3\") " Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.178144 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-utilities" (OuterVolumeSpecName: "utilities") pod "bb374283-35b0-40ec-8d09-61a5f49503d3" (UID: "bb374283-35b0-40ec-8d09-61a5f49503d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.184264 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb374283-35b0-40ec-8d09-61a5f49503d3-kube-api-access-rztl8" (OuterVolumeSpecName: "kube-api-access-rztl8") pod "bb374283-35b0-40ec-8d09-61a5f49503d3" (UID: "bb374283-35b0-40ec-8d09-61a5f49503d3"). InnerVolumeSpecName "kube-api-access-rztl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.279625 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.279657 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rztl8\" (UniqueName: \"kubernetes.io/projected/bb374283-35b0-40ec-8d09-61a5f49503d3-kube-api-access-rztl8\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.375532 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb374283-35b0-40ec-8d09-61a5f49503d3" (UID: "bb374283-35b0-40ec-8d09-61a5f49503d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.381224 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb374283-35b0-40ec-8d09-61a5f49503d3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.616807 4941 generic.go:334] "Generic (PLEG): container finished" podID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerID="33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee" exitCode=0 Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.616875 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92d94" event={"ID":"bb374283-35b0-40ec-8d09-61a5f49503d3","Type":"ContainerDied","Data":"33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee"} Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.616916 4941 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-92d94" event={"ID":"bb374283-35b0-40ec-8d09-61a5f49503d3","Type":"ContainerDied","Data":"f85ea8586bcbda3ed36e38ed67d4fe464b6791da8be1e308dd01aa802c662adf"} Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.616947 4941 scope.go:117] "RemoveContainer" containerID="33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.617150 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92d94" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.666007 4941 scope.go:117] "RemoveContainer" containerID="e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.670527 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92d94"] Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.684116 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-92d94"] Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.702526 4941 scope.go:117] "RemoveContainer" containerID="02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.752215 4941 scope.go:117] "RemoveContainer" containerID="33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee" Mar 07 07:17:01 crc kubenswrapper[4941]: E0307 07:17:01.752812 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee\": container with ID starting with 33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee not found: ID does not exist" containerID="33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.752865 4941 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee"} err="failed to get container status \"33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee\": rpc error: code = NotFound desc = could not find container \"33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee\": container with ID starting with 33945ecd97d3d7b48967e0e92de91b838119ad88a652f6df5b78dfde98860eee not found: ID does not exist" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.752891 4941 scope.go:117] "RemoveContainer" containerID="e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24" Mar 07 07:17:01 crc kubenswrapper[4941]: E0307 07:17:01.753374 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24\": container with ID starting with e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24 not found: ID does not exist" containerID="e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.753420 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24"} err="failed to get container status \"e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24\": rpc error: code = NotFound desc = could not find container \"e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24\": container with ID starting with e9c63d86ddb04f5568bcde1a34c43c5f8857fc2125ef6a590e1fb1e61c7b9c24 not found: ID does not exist" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.753438 4941 scope.go:117] "RemoveContainer" containerID="02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753" Mar 07 07:17:01 crc kubenswrapper[4941]: E0307 
07:17:01.753907 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753\": container with ID starting with 02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753 not found: ID does not exist" containerID="02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.753969 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753"} err="failed to get container status \"02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753\": rpc error: code = NotFound desc = could not find container \"02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753\": container with ID starting with 02944a20c05d56d3479f8f7713f1b62d3cf7bca1024341764ddf393c32ed0753 not found: ID does not exist" Mar 07 07:17:01 crc kubenswrapper[4941]: I0307 07:17:01.970228 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" path="/var/lib/kubelet/pods/bb374283-35b0-40ec-8d09-61a5f49503d3/volumes" Mar 07 07:17:05 crc kubenswrapper[4941]: I0307 07:17:05.034208 4941 scope.go:117] "RemoveContainer" containerID="6c7d9ada163db1d30ec3ff5c17caf22f26609e7b8d28d9fa92e24fda8b2a54d8" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.482746 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.483501 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b030b241-21f3-48a4-88de-c63abeddccb1" containerName="openstackclient" containerID="cri-o://750b1358dd25ebdb02604816fc6ce9f0509325964bef5a9bdfbf090a38266760" gracePeriod=2 Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 
07:17:08.492125 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.669919 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.670157 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerName="cinder-scheduler" containerID="cri-o://1a0ea8a8f3c822cadf12d5e4208a3d4ccded7ff8311a82b01edce4e26bce47c2" gracePeriod=30 Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.670562 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerName="probe" containerID="cri-o://4e90b5427d15f1e301d7820993316b94e70b1e5e57e33af40b7531f4506658b7" gracePeriod=30 Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.729592 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 07 07:17:08 crc kubenswrapper[4941]: E0307 07:17:08.847613 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 07 07:17:08 crc kubenswrapper[4941]: E0307 07:17:08.848082 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data podName:3963d293-d9e9-44b6-b0a5-b1532b4a0a31 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:09.348059746 +0000 UTC m=+1526.300425211 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data") pod "rabbitmq-server-0" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31") : configmap "rabbitmq-config-data" not found Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.872387 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.872685 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api-log" containerID="cri-o://36e0c01fd1cc1fab82790d367c1f67d709d44426d74406326bf00d6f4c0369ff" gracePeriod=30 Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.873085 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api" containerID="cri-o://7309868f4caab95c79325c4137c9791aaa3b778c28a0d6e39b6d6ff175e4b90e" gracePeriod=30 Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.909487 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jjz5s"] Mar 07 07:17:08 crc kubenswrapper[4941]: E0307 07:17:08.909910 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="extract-utilities" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.909926 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="extract-utilities" Mar 07 07:17:08 crc kubenswrapper[4941]: E0307 07:17:08.909949 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b030b241-21f3-48a4-88de-c63abeddccb1" containerName="openstackclient" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.909955 4941 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b030b241-21f3-48a4-88de-c63abeddccb1" containerName="openstackclient" Mar 07 07:17:08 crc kubenswrapper[4941]: E0307 07:17:08.909970 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="registry-server" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.909976 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="registry-server" Mar 07 07:17:08 crc kubenswrapper[4941]: E0307 07:17:08.909983 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="extract-content" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.909991 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="extract-content" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.910165 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb374283-35b0-40ec-8d09-61a5f49503d3" containerName="registry-server" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.910186 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b030b241-21f3-48a4-88de-c63abeddccb1" containerName="openstackclient" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.910778 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.936688 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 07 07:17:08 crc kubenswrapper[4941]: I0307 07:17:08.948486 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jjz5s"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.029463 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1815-account-create-update-8g7ft"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.030733 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.033509 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.052998 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62j4x\" (UniqueName: \"kubernetes.io/projected/99ca3e53-9ebc-464c-ac37-51163b9bc104-kube-api-access-62j4x\") pod \"root-account-create-update-jjz5s\" (UID: \"99ca3e53-9ebc-464c-ac37-51163b9bc104\") " pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.053053 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts\") pod \"root-account-create-update-jjz5s\" (UID: \"99ca3e53-9ebc-464c-ac37-51163b9bc104\") " pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.101956 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1815-account-create-update-8g7ft"] Mar 07 07:17:09 crc 
kubenswrapper[4941]: I0307 07:17:09.120051 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.120245 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="ovn-northd" containerID="cri-o://f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e" gracePeriod=30 Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.120601 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="openstack-network-exporter" containerID="cri-o://98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75" gracePeriod=30 Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.138473 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8fkwt"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.138703 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-8fkwt" podUID="88ccdd50-0997-4e6e-9e05-3555379221a0" containerName="openstack-network-exporter" containerID="cri-o://2dd2f8674fc7368ace30b5ccbaa4c590e6795b851ad19939a0b2b02851b97fd8" gracePeriod=30 Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.154565 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79039479-0c9b-4931-8d9a-84271be3fee5-operator-scripts\") pod \"cinder-1815-account-create-update-8g7ft\" (UID: \"79039479-0c9b-4931-8d9a-84271be3fee5\") " pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.154679 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62j4x\" (UniqueName: 
\"kubernetes.io/projected/99ca3e53-9ebc-464c-ac37-51163b9bc104-kube-api-access-62j4x\") pod \"root-account-create-update-jjz5s\" (UID: \"99ca3e53-9ebc-464c-ac37-51163b9bc104\") " pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.154711 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts\") pod \"root-account-create-update-jjz5s\" (UID: \"99ca3e53-9ebc-464c-ac37-51163b9bc104\") " pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.154736 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxsf\" (UniqueName: \"kubernetes.io/projected/79039479-0c9b-4931-8d9a-84271be3fee5-kube-api-access-mhxsf\") pod \"cinder-1815-account-create-update-8g7ft\" (UID: \"79039479-0c9b-4931-8d9a-84271be3fee5\") " pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.155661 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts\") pod \"root-account-create-update-jjz5s\" (UID: \"99ca3e53-9ebc-464c-ac37-51163b9bc104\") " pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.164304 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-vrr7t"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.209159 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62j4x\" (UniqueName: \"kubernetes.io/projected/99ca3e53-9ebc-464c-ac37-51163b9bc104-kube-api-access-62j4x\") pod \"root-account-create-update-jjz5s\" (UID: \"99ca3e53-9ebc-464c-ac37-51163b9bc104\") " 
pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.220458 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.233638 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.256528 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxsf\" (UniqueName: \"kubernetes.io/projected/79039479-0c9b-4931-8d9a-84271be3fee5-kube-api-access-mhxsf\") pod \"cinder-1815-account-create-update-8g7ft\" (UID: \"79039479-0c9b-4931-8d9a-84271be3fee5\") " pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.256621 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79039479-0c9b-4931-8d9a-84271be3fee5-operator-scripts\") pod \"cinder-1815-account-create-update-8g7ft\" (UID: \"79039479-0c9b-4931-8d9a-84271be3fee5\") " pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.257310 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79039479-0c9b-4931-8d9a-84271be3fee5-operator-scripts\") pod \"cinder-1815-account-create-update-8g7ft\" (UID: \"79039479-0c9b-4931-8d9a-84271be3fee5\") " pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.257520 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-972xm"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.270292 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x7fq9"] Mar 07 07:17:09 crc 
kubenswrapper[4941]: I0307 07:17:09.331879 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxsf\" (UniqueName: \"kubernetes.io/projected/79039479-0c9b-4931-8d9a-84271be3fee5-kube-api-access-mhxsf\") pod \"cinder-1815-account-create-update-8g7ft\" (UID: \"79039479-0c9b-4931-8d9a-84271be3fee5\") " pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.348681 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1815-account-create-update-hdrpv"] Mar 07 07:17:09 crc kubenswrapper[4941]: E0307 07:17:09.362105 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 07 07:17:09 crc kubenswrapper[4941]: E0307 07:17:09.362153 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data podName:3963d293-d9e9-44b6-b0a5-b1532b4a0a31 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:10.362141365 +0000 UTC m=+1527.314506830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data") pod "rabbitmq-server-0" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31") : configmap "rabbitmq-config-data" not found Mar 07 07:17:09 crc kubenswrapper[4941]: E0307 07:17:09.362841 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:09 crc kubenswrapper[4941]: E0307 07:17:09.362864 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data podName:aeb1dd04-5b8c-49b4-bf65-be38fb8ae670 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:09.862856844 +0000 UTC m=+1526.815222299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data") pod "rabbitmq-cell1-server-0" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670") : configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.411096 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.453371 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1815-account-create-update-hdrpv"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.476511 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-972xm"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.496524 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tz4v7"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.506462 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tz4v7"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.515384 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1a10-account-create-update-crrkr"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.529310 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.539729 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.602018 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqpj9\" (UniqueName: \"kubernetes.io/projected/6a1c3983-6c5e-48af-95cf-5f9536835f8d-kube-api-access-tqpj9\") pod \"neutron-1a10-account-create-update-crrkr\" (UID: \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\") " pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.602062 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1c3983-6c5e-48af-95cf-5f9536835f8d-operator-scripts\") pod \"neutron-1a10-account-create-update-crrkr\" (UID: \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\") " pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.623702 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1a10-account-create-update-crrkr"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.696194 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1a10-account-create-update-mk8sb"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.714704 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqpj9\" (UniqueName: \"kubernetes.io/projected/6a1c3983-6c5e-48af-95cf-5f9536835f8d-kube-api-access-tqpj9\") pod \"neutron-1a10-account-create-update-crrkr\" (UID: \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\") " pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.714985 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1c3983-6c5e-48af-95cf-5f9536835f8d-operator-scripts\") pod \"neutron-1a10-account-create-update-crrkr\" (UID: \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\") " pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.715823 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1c3983-6c5e-48af-95cf-5f9536835f8d-operator-scripts\") pod \"neutron-1a10-account-create-update-crrkr\" (UID: \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\") " pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.729872 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1a10-account-create-update-mk8sb"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.746047 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-427c-account-create-update-f84ls"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.748318 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.758120 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mzq6w"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.760185 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.772736 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.772937 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.778038 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-427c-account-create-update-f84ls"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.788491 4941 generic.go:334] "Generic (PLEG): container finished" podID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerID="98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75" exitCode=2 Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.788757 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1ad12db-0b25-4e03-8772-de047be41b0d","Type":"ContainerDied","Data":"98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75"} Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.797090 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqpj9\" (UniqueName: \"kubernetes.io/projected/6a1c3983-6c5e-48af-95cf-5f9536835f8d-kube-api-access-tqpj9\") pod \"neutron-1a10-account-create-update-crrkr\" (UID: \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\") " pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.798516 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8fkwt_88ccdd50-0997-4e6e-9e05-3555379221a0/openstack-network-exporter/0.log" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.798554 4941 generic.go:334] "Generic (PLEG): container finished" podID="88ccdd50-0997-4e6e-9e05-3555379221a0" 
containerID="2dd2f8674fc7368ace30b5ccbaa4c590e6795b851ad19939a0b2b02851b97fd8" exitCode=2 Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.798602 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8fkwt" event={"ID":"88ccdd50-0997-4e6e-9e05-3555379221a0","Type":"ContainerDied","Data":"2dd2f8674fc7368ace30b5ccbaa4c590e6795b851ad19939a0b2b02851b97fd8"} Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.800322 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mzq6w"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.816523 4941 generic.go:334] "Generic (PLEG): container finished" podID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerID="36e0c01fd1cc1fab82790d367c1f67d709d44426d74406326bf00d6f4c0369ff" exitCode=143 Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.816560 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"753c78f9-47e6-4098-91fa-9adac0997ba4","Type":"ContainerDied","Data":"36e0c01fd1cc1fab82790d367c1f67d709d44426d74406326bf00d6f4c0369ff"} Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.817470 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5379-account-create-update-6728g"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.818517 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.819442 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-operator-scripts\") pod \"nova-api-427c-account-create-update-f84ls\" (UID: \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\") " pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.819584 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbp6\" (UniqueName: \"kubernetes.io/projected/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-kube-api-access-bzbp6\") pod \"nova-api-427c-account-create-update-f84ls\" (UID: \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\") " pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.823702 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.836886 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-6728g"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.875332 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-n6fhd"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.892634 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-n6fhd"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.904081 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-27tlg"] Mar 07 07:17:09 crc kubenswrapper[4941]: E0307 07:17:09.914326 4941 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 
137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-x7fq9" message=< Mar 07 07:17:09 crc kubenswrapper[4941]: Exiting ovn-controller (1) [ OK ] Mar 07 07:17:09 crc kubenswrapper[4941]: > Mar 07 07:17:09 crc kubenswrapper[4941]: E0307 07:17:09.914366 4941 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-x7fq9" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" containerName="ovn-controller" containerID="cri-o://4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.914419 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-x7fq9" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" containerName="ovn-controller" containerID="cri-o://4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533" gracePeriod=30 Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.917029 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-27tlg"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.925299 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2ztz\" (UniqueName: \"kubernetes.io/projected/b055e9de-2e86-467d-9e93-8fd06977cc87-kube-api-access-g2ztz\") pod \"nova-cell0-3b99-account-create-update-mzq6w\" (UID: \"b055e9de-2e86-467d-9e93-8fd06977cc87\") " pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.925371 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-operator-scripts\") pod \"nova-api-427c-account-create-update-f84ls\" (UID: \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\") " 
pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.925438 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzbp6\" (UniqueName: \"kubernetes.io/projected/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-kube-api-access-bzbp6\") pod \"nova-api-427c-account-create-update-f84ls\" (UID: \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\") " pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.925479 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b055e9de-2e86-467d-9e93-8fd06977cc87-operator-scripts\") pod \"nova-cell0-3b99-account-create-update-mzq6w\" (UID: \"b055e9de-2e86-467d-9e93-8fd06977cc87\") " pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:09 crc kubenswrapper[4941]: E0307 07:17:09.925716 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:09 crc kubenswrapper[4941]: E0307 07:17:09.925768 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data podName:aeb1dd04-5b8c-49b4-bf65-be38fb8ae670 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:10.925749942 +0000 UTC m=+1527.878115407 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data") pod "rabbitmq-cell1-server-0" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670") : configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.926914 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-operator-scripts\") pod \"nova-api-427c-account-create-update-f84ls\" (UID: \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\") " pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.938218 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-m9v9g"] Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.961866 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbp6\" (UniqueName: \"kubernetes.io/projected/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-kube-api-access-bzbp6\") pod \"nova-api-427c-account-create-update-f84ls\" (UID: \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\") " pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:09 crc kubenswrapper[4941]: I0307 07:17:09.995890 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.030420 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts\") pod \"nova-cell1-5379-account-create-update-6728g\" (UID: \"53e374be-8342-42ac-a82a-75854d38e098\") " pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.030813 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prrs5\" (UniqueName: \"kubernetes.io/projected/53e374be-8342-42ac-a82a-75854d38e098-kube-api-access-prrs5\") pod \"nova-cell1-5379-account-create-update-6728g\" (UID: \"53e374be-8342-42ac-a82a-75854d38e098\") " pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.030853 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2ztz\" (UniqueName: \"kubernetes.io/projected/b055e9de-2e86-467d-9e93-8fd06977cc87-kube-api-access-g2ztz\") pod \"nova-cell0-3b99-account-create-update-mzq6w\" (UID: \"b055e9de-2e86-467d-9e93-8fd06977cc87\") " pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.030927 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b055e9de-2e86-467d-9e93-8fd06977cc87-operator-scripts\") pod \"nova-cell0-3b99-account-create-update-mzq6w\" (UID: \"b055e9de-2e86-467d-9e93-8fd06977cc87\") " pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.065474 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/b055e9de-2e86-467d-9e93-8fd06977cc87-operator-scripts\") pod \"nova-cell0-3b99-account-create-update-mzq6w\" (UID: \"b055e9de-2e86-467d-9e93-8fd06977cc87\") " pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.077187 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1b17cf-5cc0-4c89-8757-7cc78a79a94f" path="/var/lib/kubelet/pods/2a1b17cf-5cc0-4c89-8757-7cc78a79a94f/volumes" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.077735 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d89d1d4-04b0-4778-98d7-1cc12db0588b" path="/var/lib/kubelet/pods/3d89d1d4-04b0-4778-98d7-1cc12db0588b/volumes" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.078255 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdf20f4-25fe-480f-9d5a-f593b6d9a763" path="/var/lib/kubelet/pods/6bdf20f4-25fe-480f-9d5a-f593b6d9a763/volumes" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.103866 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2ztz\" (UniqueName: \"kubernetes.io/projected/b055e9de-2e86-467d-9e93-8fd06977cc87-kube-api-access-g2ztz\") pod \"nova-cell0-3b99-account-create-update-mzq6w\" (UID: \"b055e9de-2e86-467d-9e93-8fd06977cc87\") " pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.127456 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be73b1fb-9f01-4e2b-a4fa-7f004be742e3" path="/var/lib/kubelet/pods/be73b1fb-9f01-4e2b-a4fa-7f004be742e3/volumes" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.128551 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf0adf58-bd20-433a-a80c-0a871ec201b4" path="/var/lib/kubelet/pods/bf0adf58-bd20-433a-a80c-0a871ec201b4/volumes" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.129110 
4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc655c4d-3dd7-40c6-85c9-d53daedf8a65" path="/var/lib/kubelet/pods/dc655c4d-3dd7-40c6-85c9-d53daedf8a65/volumes" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.129689 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-m9v9g"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.129715 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-427c-account-create-update-fzfss"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.129726 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-427c-account-create-update-fzfss"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.129736 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.129749 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.129759 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mr8p7"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.129768 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mr8p7"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.130914 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-auditor" containerID="cri-o://2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.131336 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerName="openstack-network-exporter" 
containerID="cri-o://067c949e419323cc676040b4fd78ae141b059399cfde952987dfd044a128e4f0" gracePeriod=300 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.131731 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="swift-recon-cron" containerID="cri-o://de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.131865 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="rsync" containerID="cri-o://85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.131995 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-expirer" containerID="cri-o://855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.132100 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-updater" containerID="cri-o://56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.132236 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-replicator" containerID="cri-o://b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.132342 4941 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-replicator" containerID="cri-o://abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.132446 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-server" containerID="cri-o://94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.132558 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-updater" containerID="cri-o://3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.132647 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-auditor" containerID="cri-o://76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.132785 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-reaper" containerID="cri-o://b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.132917 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-server" containerID="cri-o://dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef" gracePeriod=30 Mar 07 07:17:10 crc 
kubenswrapper[4941]: I0307 07:17:10.133050 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-auditor" containerID="cri-o://c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.133172 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-replicator" containerID="cri-o://39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.134227 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts\") pod \"nova-cell1-5379-account-create-update-6728g\" (UID: \"53e374be-8342-42ac-a82a-75854d38e098\") " pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.134352 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prrs5\" (UniqueName: \"kubernetes.io/projected/53e374be-8342-42ac-a82a-75854d38e098-kube-api-access-prrs5\") pod \"nova-cell1-5379-account-create-update-6728g\" (UID: \"53e374be-8342-42ac-a82a-75854d38e098\") " pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.136087 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts\") pod \"nova-cell1-5379-account-create-update-6728g\" (UID: \"53e374be-8342-42ac-a82a-75854d38e098\") " pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:10 crc kubenswrapper[4941]: 
I0307 07:17:10.158103 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.158432 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="openstack-network-exporter" containerID="cri-o://0fdfa5c28298504762261e59ae8634154d32cea145ca796274ab84812c8beeb1" gracePeriod=300 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.130392 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-server" containerID="cri-o://b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.215152 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prrs5\" (UniqueName: \"kubernetes.io/projected/53e374be-8342-42ac-a82a-75854d38e098-kube-api-access-prrs5\") pod \"nova-cell1-5379-account-create-update-6728g\" (UID: \"53e374be-8342-42ac-a82a-75854d38e098\") " pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.215901 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.233688 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.310902 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-k7f6s"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.319329 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.319409 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.355627 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerName="ovsdbserver-sb" containerID="cri-o://be456d1e5f8553ef85166e01f835ac78767170cdd6c5ad60cdfc7756602760a6" gracePeriod=300 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.359585 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-k7f6s"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.362835 4941 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.362890 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts podName:53e374be-8342-42ac-a82a-75854d38e098 nodeName:}" failed. 
No retries permitted until 2026-03-07 07:17:10.862874068 +0000 UTC m=+1527.815239533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts") pod "nova-cell1-5379-account-create-update-6728g" (UID: "53e374be-8342-42ac-a82a-75854d38e098") : configmap "openstack-cell1-scripts" not found Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.365315 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.399251 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-v8gsj"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.449853 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-v8gsj"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.472748 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.472863 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data podName:3963d293-d9e9-44b6-b0a5-b1532b4a0a31 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:12.472824789 +0000 UTC m=+1529.425190254 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data") pod "rabbitmq-server-0" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31") : configmap "rabbitmq-config-data" not found Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.514918 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.515167 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerName="glance-log" containerID="cri-o://cee97226fd2fe2196abc3b2a74e837c3f4b06a16e8e4628edcae67b979c11f70" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.515571 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerName="glance-httpd" containerID="cri-o://ed43789861becd87eee81a4232f20de6afb6f8198fc9dd762f6924dee8e81bc0" gracePeriod=30 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.559671 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jswzb"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.587516 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jswzb"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.617371 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="ovsdbserver-nb" containerID="cri-o://ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad" gracePeriod=300 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.638681 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-xmpfq"] Mar 07 07:17:10 crc 
kubenswrapper[4941]: I0307 07:17:10.638925 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" podUID="1198c32e-6783-497e-a232-5dd01865ecfd" containerName="dnsmasq-dns" containerID="cri-o://5125f9279c7ec81414bff41771650af0c2b88b60d9d5f2957f576c5e9948b646" gracePeriod=10 Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.648813 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.657873 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad is running failed: container process not found" containerID="ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.659624 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad is running failed: container process not found" containerID="ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.659663 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" 
podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="ovsdbserver-nb" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.682490 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gt9jr"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.682520 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533 is running failed: container process not found" containerID="4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.688742 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533 is running failed: container process not found" containerID="4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.689994 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533 is running failed: container process not found" containerID="4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.690028 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533 is running failed: container process not found" 
probeType="Readiness" pod="openstack/ovn-controller-x7fq9" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" containerName="ovn-controller" Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.770672 4941 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 07 07:17:10 crc kubenswrapper[4941]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 07 07:17:10 crc kubenswrapper[4941]: + source /usr/local/bin/container-scripts/functions Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNBridge=br-int Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNRemote=tcp:localhost:6642 Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNEncapType=geneve Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNAvailabilityZones= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ EnableChassisAsGateway=true Mar 07 07:17:10 crc kubenswrapper[4941]: ++ PhysicalNetworks= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNHostName= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 07 07:17:10 crc kubenswrapper[4941]: ++ ovs_dir=/var/lib/openvswitch Mar 07 07:17:10 crc kubenswrapper[4941]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 07 07:17:10 crc kubenswrapper[4941]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 07 07:17:10 crc kubenswrapper[4941]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + sleep 0.5 Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + sleep 0.5 Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + cleanup_ovsdb_server_semaphore Mar 07 07:17:10 crc kubenswrapper[4941]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 07 07:17:10 crc kubenswrapper[4941]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 07 07:17:10 crc kubenswrapper[4941]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-vrr7t" message=< Mar 07 07:17:10 crc kubenswrapper[4941]: Exiting ovsdb-server (5) [ OK ] Mar 07 07:17:10 crc kubenswrapper[4941]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 07 07:17:10 crc kubenswrapper[4941]: + source /usr/local/bin/container-scripts/functions Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNBridge=br-int Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNRemote=tcp:localhost:6642 Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNEncapType=geneve Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNAvailabilityZones= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ EnableChassisAsGateway=true Mar 07 07:17:10 crc kubenswrapper[4941]: ++ PhysicalNetworks= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNHostName= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 07 07:17:10 crc kubenswrapper[4941]: ++ ovs_dir=/var/lib/openvswitch Mar 07 07:17:10 crc kubenswrapper[4941]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 07 07:17:10 crc kubenswrapper[4941]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 07 07:17:10 crc kubenswrapper[4941]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + sleep 0.5 Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + sleep 0.5 Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + cleanup_ovsdb_server_semaphore Mar 07 07:17:10 crc kubenswrapper[4941]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 07 07:17:10 crc kubenswrapper[4941]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 07 07:17:10 crc kubenswrapper[4941]: > Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.770711 4941 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 07 07:17:10 crc kubenswrapper[4941]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 07 07:17:10 crc kubenswrapper[4941]: + source /usr/local/bin/container-scripts/functions Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNBridge=br-int Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNRemote=tcp:localhost:6642 Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNEncapType=geneve Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNAvailabilityZones= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ EnableChassisAsGateway=true Mar 07 07:17:10 crc kubenswrapper[4941]: ++ PhysicalNetworks= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ OVNHostName= Mar 07 07:17:10 crc kubenswrapper[4941]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 07 07:17:10 crc kubenswrapper[4941]: ++ ovs_dir=/var/lib/openvswitch Mar 07 07:17:10 crc kubenswrapper[4941]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 07 07:17:10 crc kubenswrapper[4941]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 07 07:17:10 crc kubenswrapper[4941]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + sleep 0.5 Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + sleep 0.5 Mar 07 07:17:10 crc kubenswrapper[4941]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 07 07:17:10 crc kubenswrapper[4941]: + cleanup_ovsdb_server_semaphore Mar 07 07:17:10 crc kubenswrapper[4941]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 07 07:17:10 crc kubenswrapper[4941]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 07 07:17:10 crc kubenswrapper[4941]: > pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server" containerID="cri-o://cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.770755 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server" containerID="cri-o://cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" gracePeriod=29 Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.776397 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.776718 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd" 
containerID="cri-o://bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" gracePeriod=29 Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.809144 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.811608 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gt9jr"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.820687 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.820754 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.878041 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ppswq"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.897051 4941 generic.go:334] "Generic (PLEG): container finished" podID="5f4f0d58-e159-427f-8cca-95525d4968cd" 
containerID="4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533" exitCode=0 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.897118 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9" event={"ID":"5f4f0d58-e159-427f-8cca-95525d4968cd","Type":"ContainerDied","Data":"4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533"} Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.906113 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd" probeResult="failure" output=< Mar 07 07:17:10 crc kubenswrapper[4941]: cat: /var/run/openvswitch/ovs-vswitchd.pid: No such file or directory Mar 07 07:17:10 crc kubenswrapper[4941]: ERROR - Failed to get pid for ovs-vswitchd, exit status: 0 Mar 07 07:17:10 crc kubenswrapper[4941]: > Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.914673 4941 generic.go:334] "Generic (PLEG): container finished" podID="1198c32e-6783-497e-a232-5dd01865ecfd" containerID="5125f9279c7ec81414bff41771650af0c2b88b60d9d5f2957f576c5e9948b646" exitCode=0 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.914729 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" event={"ID":"1198c32e-6783-497e-a232-5dd01865ecfd","Type":"ContainerDied","Data":"5125f9279c7ec81414bff41771650af0c2b88b60d9d5f2957f576c5e9948b646"} Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.919348 4941 generic.go:334] "Generic (PLEG): container finished" podID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerID="4e90b5427d15f1e301d7820993316b94e70b1e5e57e33af40b7531f4506658b7" exitCode=0 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.919365 4941 generic.go:334] "Generic (PLEG): container finished" podID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerID="1a0ea8a8f3c822cadf12d5e4208a3d4ccded7ff8311a82b01edce4e26bce47c2" exitCode=0 Mar 
07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.919421 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"317acc48-d39a-4c99-8a4e-ef91b0fc3894","Type":"ContainerDied","Data":"4e90b5427d15f1e301d7820993316b94e70b1e5e57e33af40b7531f4506658b7"} Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.919518 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"317acc48-d39a-4c99-8a4e-ef91b0fc3894","Type":"ContainerDied","Data":"1a0ea8a8f3c822cadf12d5e4208a3d4ccded7ff8311a82b01edce4e26bce47c2"} Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.920651 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ppswq"] Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.929433 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.930796 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data podName:aeb1dd04-5b8c-49b4-bf65-be38fb8ae670 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:12.930779004 +0000 UTC m=+1529.883144469 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data") pod "rabbitmq-cell1-server-0" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670") : configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.929439 4941 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 07 07:17:10 crc kubenswrapper[4941]: E0307 07:17:10.930988 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts podName:53e374be-8342-42ac-a82a-75854d38e098 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:11.930966629 +0000 UTC m=+1528.883332094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts") pod "nova-cell1-5379-account-create-update-6728g" (UID: "53e374be-8342-42ac-a82a-75854d38e098") : configmap "openstack-cell1-scripts" not found Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.947486 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_32aba6f1-c08f-4826-8492-9f2979275f5e/ovsdbserver-sb/0.log" Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.947793 4941 generic.go:334] "Generic (PLEG): container finished" podID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerID="067c949e419323cc676040b4fd78ae141b059399cfde952987dfd044a128e4f0" exitCode=2 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.947811 4941 generic.go:334] "Generic (PLEG): container finished" podID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerID="be456d1e5f8553ef85166e01f835ac78767170cdd6c5ad60cdfc7756602760a6" exitCode=143 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.947877 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"32aba6f1-c08f-4826-8492-9f2979275f5e","Type":"ContainerDied","Data":"067c949e419323cc676040b4fd78ae141b059399cfde952987dfd044a128e4f0"} Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.947903 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"32aba6f1-c08f-4826-8492-9f2979275f5e","Type":"ContainerDied","Data":"be456d1e5f8553ef85166e01f835ac78767170cdd6c5ad60cdfc7756602760a6"} Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.958785 4941 generic.go:334] "Generic (PLEG): container finished" podID="531af2a1-d934-48a5-b3de-61d475bf252f" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" exitCode=0 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.958899 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrr7t" event={"ID":"531af2a1-d934-48a5-b3de-61d475bf252f","Type":"ContainerDied","Data":"cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd"} Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.961437 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-cwhp5"] Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.976324 4941 generic.go:334] "Generic (PLEG): container finished" podID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerID="cee97226fd2fe2196abc3b2a74e837c3f4b06a16e8e4628edcae67b979c11f70" exitCode=143 Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.976435 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6d72c12-422e-48fd-b56b-8344260e3e01","Type":"ContainerDied","Data":"cee97226fd2fe2196abc3b2a74e837c3f4b06a16e8e4628edcae67b979c11f70"} Mar 07 07:17:10 crc kubenswrapper[4941]: I0307 07:17:10.996610 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-cwhp5"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.025497 
4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-685ff95674-ldzd4"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.025751 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-685ff95674-ldzd4" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerName="placement-log" containerID="cri-o://f8fe04edd08619bd4118ac72769de11cb1dc8824ae2dfa0799d60d9d7cab0731" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.026180 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-685ff95674-ldzd4" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerName="placement-api" containerID="cri-o://fb6516769d261d733fc9be130e2e6aea292a7c5ff2e94299bbabd569bbe859a7" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052215 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052244 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052252 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052260 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052267 4941 generic.go:334] "Generic (PLEG): container finished" 
podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052273 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052279 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052286 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052293 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052299 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052305 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052310 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89" exitCode=0 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052347 4941 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052372 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052381 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052390 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052418 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052428 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052436 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052445 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052453 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052463 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052471 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.052479 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.058492 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.058762 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerName="glance-log" containerID="cri-o://b048a7900d858225d4830b842a8b3ed5f22a79a6146d7e6677222d62c98ef913" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.059011 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerName="glance-httpd" containerID="cri-o://885046a8a3875aba6340cf20fd6d156d9145acc04deb00a3dc3da2bb74d33aa4" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.074563 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ec9ebffe-8c04-481b-a187-bcdcca1a49a9/ovsdbserver-nb/0.log" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.074603 4941 generic.go:334] "Generic (PLEG): container finished" podID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerID="0fdfa5c28298504762261e59ae8634154d32cea145ca796274ab84812c8beeb1" exitCode=2 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.074621 4941 generic.go:334] "Generic (PLEG): container finished" podID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerID="ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad" exitCode=143 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.074672 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ec9ebffe-8c04-481b-a187-bcdcca1a49a9","Type":"ContainerDied","Data":"0fdfa5c28298504762261e59ae8634154d32cea145ca796274ab84812c8beeb1"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.074705 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ec9ebffe-8c04-481b-a187-bcdcca1a49a9","Type":"ContainerDied","Data":"ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad"} Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.093919 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-server-0"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.098352 4941 generic.go:334] "Generic (PLEG): container finished" podID="b030b241-21f3-48a4-88de-c63abeddccb1" containerID="750b1358dd25ebdb02604816fc6ce9f0509325964bef5a9bdfbf090a38266760" exitCode=137 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.175773 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-040c-account-create-update-6phkj"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.195230 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-040c-account-create-update-6phkj"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.232935 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerName="rabbitmq" containerID="cri-o://fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55" gracePeriod=604800 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.321202 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c6df5b777-qhsgz"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.321860 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c6df5b777-qhsgz" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-api" containerID="cri-o://f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.322269 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c6df5b777-qhsgz" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-httpd" containerID="cri-o://02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.323825 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-8fkwt_88ccdd50-0997-4e6e-9e05-3555379221a0/openstack-network-exporter/0.log" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.323885 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.332050 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7fq9" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.337899 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b246-account-create-update-c975h"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.353770 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b246-account-create-update-c975h"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.362004 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-pfpsm"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.375077 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-mhtvx"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.386344 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-pfpsm"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.399397 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-mhtvx"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.408213 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-346c-account-create-update-94z8w"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.418522 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.422564 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-57855ff457-mshjt"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.422797 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-57855ff457-mshjt" podUID="e27683db-592f-485a-93b3-93273e1644c3" containerName="barbican-worker-log" containerID="cri-o://5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.423589 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-57855ff457-mshjt" podUID="e27683db-592f-485a-93b3-93273e1644c3" containerName="barbican-worker" containerID="cri-o://1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439685 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-metrics-certs-tls-certs\") pod \"88ccdd50-0997-4e6e-9e05-3555379221a0\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439755 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f4f0d58-e159-427f-8cca-95525d4968cd-scripts\") pod \"5f4f0d58-e159-427f-8cca-95525d4968cd\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439796 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-combined-ca-bundle\") pod \"5f4f0d58-e159-427f-8cca-95525d4968cd\" (UID: 
\"5f4f0d58-e159-427f-8cca-95525d4968cd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439820 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmskj\" (UniqueName: \"kubernetes.io/projected/5f4f0d58-e159-427f-8cca-95525d4968cd-kube-api-access-qmskj\") pod \"5f4f0d58-e159-427f-8cca-95525d4968cd\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439838 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dh27\" (UniqueName: \"kubernetes.io/projected/88ccdd50-0997-4e6e-9e05-3555379221a0-kube-api-access-2dh27\") pod \"88ccdd50-0997-4e6e-9e05-3555379221a0\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439877 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config-secret\") pod \"b030b241-21f3-48a4-88de-c63abeddccb1\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439925 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run-ovn\") pod \"5f4f0d58-e159-427f-8cca-95525d4968cd\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439946 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovn-rundir\") pod \"88ccdd50-0997-4e6e-9e05-3555379221a0\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.439968 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run\") pod \"5f4f0d58-e159-427f-8cca-95525d4968cd\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.440001 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-combined-ca-bundle\") pod \"b030b241-21f3-48a4-88de-c63abeddccb1\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.440033 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwv7c\" (UniqueName: \"kubernetes.io/projected/b030b241-21f3-48a4-88de-c63abeddccb1-kube-api-access-dwv7c\") pod \"b030b241-21f3-48a4-88de-c63abeddccb1\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.440064 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-log-ovn\") pod \"5f4f0d58-e159-427f-8cca-95525d4968cd\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.440083 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-ovn-controller-tls-certs\") pod \"5f4f0d58-e159-427f-8cca-95525d4968cd\" (UID: \"5f4f0d58-e159-427f-8cca-95525d4968cd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.440111 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88ccdd50-0997-4e6e-9e05-3555379221a0-config\") pod \"88ccdd50-0997-4e6e-9e05-3555379221a0\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " Mar 07 07:17:11 crc 
kubenswrapper[4941]: I0307 07:17:11.440139 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovs-rundir\") pod \"88ccdd50-0997-4e6e-9e05-3555379221a0\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.440165 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-combined-ca-bundle\") pod \"88ccdd50-0997-4e6e-9e05-3555379221a0\" (UID: \"88ccdd50-0997-4e6e-9e05-3555379221a0\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.440198 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config\") pod \"b030b241-21f3-48a4-88de-c63abeddccb1\" (UID: \"b030b241-21f3-48a4-88de-c63abeddccb1\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.441648 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f4f0d58-e159-427f-8cca-95525d4968cd-scripts" (OuterVolumeSpecName: "scripts") pod "5f4f0d58-e159-427f-8cca-95525d4968cd" (UID: "5f4f0d58-e159-427f-8cca-95525d4968cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.441685 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run" (OuterVolumeSpecName: "var-run") pod "5f4f0d58-e159-427f-8cca-95525d4968cd" (UID: "5f4f0d58-e159-427f-8cca-95525d4968cd"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.449569 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-346c-account-create-update-94z8w"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.450066 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5f4f0d58-e159-427f-8cca-95525d4968cd" (UID: "5f4f0d58-e159-427f-8cca-95525d4968cd"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.450124 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "88ccdd50-0997-4e6e-9e05-3555379221a0" (UID: "88ccdd50-0997-4e6e-9e05-3555379221a0"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.450183 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5f4f0d58-e159-427f-8cca-95525d4968cd" (UID: "5f4f0d58-e159-427f-8cca-95525d4968cd"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.450765 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88ccdd50-0997-4e6e-9e05-3555379221a0-config" (OuterVolumeSpecName: "config") pod "88ccdd50-0997-4e6e-9e05-3555379221a0" (UID: "88ccdd50-0997-4e6e-9e05-3555379221a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.450799 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "88ccdd50-0997-4e6e-9e05-3555379221a0" (UID: "88ccdd50-0997-4e6e-9e05-3555379221a0"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.458871 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.463561 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1815-account-create-update-8g7ft"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.470934 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4bnlh"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.476997 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4bnlh"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.483544 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ccdd50-0997-4e6e-9e05-3555379221a0-kube-api-access-2dh27" (OuterVolumeSpecName: "kube-api-access-2dh27") pod "88ccdd50-0997-4e6e-9e05-3555379221a0" (UID: "88ccdd50-0997-4e6e-9e05-3555379221a0"). InnerVolumeSpecName "kube-api-access-2dh27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.489863 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b030b241-21f3-48a4-88de-c63abeddccb1-kube-api-access-dwv7c" (OuterVolumeSpecName: "kube-api-access-dwv7c") pod "b030b241-21f3-48a4-88de-c63abeddccb1" (UID: "b030b241-21f3-48a4-88de-c63abeddccb1"). 
InnerVolumeSpecName "kube-api-access-dwv7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.506618 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.506875 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-log" containerID="cri-o://e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.507438 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-api" containerID="cri-o://8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.530688 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f4f0d58-e159-427f-8cca-95525d4968cd-kube-api-access-qmskj" (OuterVolumeSpecName: "kube-api-access-qmskj") pod "5f4f0d58-e159-427f-8cca-95525d4968cd" (UID: "5f4f0d58-e159-427f-8cca-95525d4968cd"). InnerVolumeSpecName "kube-api-access-qmskj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.541253 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data-custom\") pod \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.541527 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/317acc48-d39a-4c99-8a4e-ef91b0fc3894-etc-machine-id\") pod \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.549185 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlxkt\" (UniqueName: \"kubernetes.io/projected/317acc48-d39a-4c99-8a4e-ef91b0fc3894-kube-api-access-nlxkt\") pod \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.549366 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-scripts\") pod \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.549609 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-combined-ca-bundle\") pod \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.550190 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data\") pod \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\" (UID: \"317acc48-d39a-4c99-8a4e-ef91b0fc3894\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.550839 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88ccdd50-0997-4e6e-9e05-3555379221a0-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.552854 4941 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.552939 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f4f0d58-e159-427f-8cca-95525d4968cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.553004 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmskj\" (UniqueName: \"kubernetes.io/projected/5f4f0d58-e159-427f-8cca-95525d4968cd-kube-api-access-qmskj\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.553056 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dh27\" (UniqueName: \"kubernetes.io/projected/88ccdd50-0997-4e6e-9e05-3555379221a0-kube-api-access-2dh27\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.553108 4941 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.553159 4941 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/88ccdd50-0997-4e6e-9e05-3555379221a0-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.553305 4941 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.553362 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwv7c\" (UniqueName: \"kubernetes.io/projected/b030b241-21f3-48a4-88de-c63abeddccb1-kube-api-access-dwv7c\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.553429 4941 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f4f0d58-e159-427f-8cca-95525d4968cd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.543261 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/317acc48-d39a-4c99-8a4e-ef91b0fc3894-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "317acc48-d39a-4c99-8a4e-ef91b0fc3894" (UID: "317acc48-d39a-4c99-8a4e-ef91b0fc3894"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.566059 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.576687 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317acc48-d39a-4c99-8a4e-ef91b0fc3894-kube-api-access-nlxkt" (OuterVolumeSpecName: "kube-api-access-nlxkt") pod "317acc48-d39a-4c99-8a4e-ef91b0fc3894" (UID: "317acc48-d39a-4c99-8a4e-ef91b0fc3894"). InnerVolumeSpecName "kube-api-access-nlxkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.576755 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1a10-account-create-update-crrkr"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.577900 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b030b241-21f3-48a4-88de-c63abeddccb1" (UID: "b030b241-21f3-48a4-88de-c63abeddccb1"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.577943 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-scripts" (OuterVolumeSpecName: "scripts") pod "317acc48-d39a-4c99-8a4e-ef91b0fc3894" (UID: "317acc48-d39a-4c99-8a4e-ef91b0fc3894"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.581628 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "317acc48-d39a-4c99-8a4e-ef91b0fc3894" (UID: "317acc48-d39a-4c99-8a4e-ef91b0fc3894"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.584824 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rgx2z"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.592023 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rgx2z"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.593115 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b030b241-21f3-48a4-88de-c63abeddccb1" (UID: "b030b241-21f3-48a4-88de-c63abeddccb1"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.598293 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88ccdd50-0997-4e6e-9e05-3555379221a0" (UID: "88ccdd50-0997-4e6e-9e05-3555379221a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.602375 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b030b241-21f3-48a4-88de-c63abeddccb1" (UID: "b030b241-21f3-48a4-88de-c63abeddccb1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.605505 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-fkcb9"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.611007 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-fkcb9"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.627375 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f4f0d58-e159-427f-8cca-95525d4968cd" (UID: "5f4f0d58-e159-427f-8cca-95525d4968cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.641376 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.643793 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_32aba6f1-c08f-4826-8492-9f2979275f5e/ovsdbserver-sb/0.log" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.643874 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.652965 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.655894 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-nb\") pod \"1198c32e-6783-497e-a232-5dd01865ecfd\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656040 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdb-rundir\") pod \"32aba6f1-c08f-4826-8492-9f2979275f5e\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656144 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzp4m\" (UniqueName: \"kubernetes.io/projected/32aba6f1-c08f-4826-8492-9f2979275f5e-kube-api-access-fzp4m\") pod \"32aba6f1-c08f-4826-8492-9f2979275f5e\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656220 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5k24\" (UniqueName: \"kubernetes.io/projected/1198c32e-6783-497e-a232-5dd01865ecfd-kube-api-access-n5k24\") pod \"1198c32e-6783-497e-a232-5dd01865ecfd\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656284 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-config\") pod \"1198c32e-6783-497e-a232-5dd01865ecfd\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " Mar 
07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656388 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-swift-storage-0\") pod \"1198c32e-6783-497e-a232-5dd01865ecfd\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656470 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-config\") pod \"32aba6f1-c08f-4826-8492-9f2979275f5e\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656552 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-metrics-certs-tls-certs\") pod \"32aba6f1-c08f-4826-8492-9f2979275f5e\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656614 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-svc\") pod \"1198c32e-6783-497e-a232-5dd01865ecfd\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656660 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "32aba6f1-c08f-4826-8492-9f2979275f5e" (UID: "32aba6f1-c08f-4826-8492-9f2979275f5e"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656776 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdbserver-sb-tls-certs\") pod \"32aba6f1-c08f-4826-8492-9f2979275f5e\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656844 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"32aba6f1-c08f-4826-8492-9f2979275f5e\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656931 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-combined-ca-bundle\") pod \"32aba6f1-c08f-4826-8492-9f2979275f5e\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.656998 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-scripts\") pod \"32aba6f1-c08f-4826-8492-9f2979275f5e\" (UID: \"32aba6f1-c08f-4826-8492-9f2979275f5e\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657064 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-sb\") pod \"1198c32e-6783-497e-a232-5dd01865ecfd\" (UID: \"1198c32e-6783-497e-a232-5dd01865ecfd\") " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657523 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657587 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657638 4941 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657687 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657737 4941 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657788 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657847 4941 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/317acc48-d39a-4c99-8a4e-ef91b0fc3894-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657903 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b030b241-21f3-48a4-88de-c63abeddccb1-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.657954 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.658014 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlxkt\" (UniqueName: \"kubernetes.io/projected/317acc48-d39a-4c99-8a4e-ef91b0fc3894-kube-api-access-nlxkt\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.660240 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-6728g"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.664477 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-scripts" (OuterVolumeSpecName: "scripts") pod "32aba6f1-c08f-4826-8492-9f2979275f5e" (UID: "32aba6f1-c08f-4826-8492-9f2979275f5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.691581 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "32aba6f1-c08f-4826-8492-9f2979275f5e" (UID: "32aba6f1-c08f-4826-8492-9f2979275f5e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.696386 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-config" (OuterVolumeSpecName: "config") pod "32aba6f1-c08f-4826-8492-9f2979275f5e" (UID: "32aba6f1-c08f-4826-8492-9f2979275f5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.706085 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.707383 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.707736 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="05a91fa3-14f1-4d15-bdfc-bb1fc310a913" containerName="nova-scheduler-scheduler" containerID="cri-o://ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.719057 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mc6xs"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.745589 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mc6xs"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.749904 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32aba6f1-c08f-4826-8492-9f2979275f5e-kube-api-access-fzp4m" (OuterVolumeSpecName: "kube-api-access-fzp4m") pod "32aba6f1-c08f-4826-8492-9f2979275f5e" (UID: "32aba6f1-c08f-4826-8492-9f2979275f5e"). InnerVolumeSpecName "kube-api-access-fzp4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.766978 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.767032 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.767049 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32aba6f1-c08f-4826-8492-9f2979275f5e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.767064 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzp4m\" (UniqueName: \"kubernetes.io/projected/32aba6f1-c08f-4826-8492-9f2979275f5e-kube-api-access-fzp4m\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.775320 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mzq6w"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.823394 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1198c32e-6783-497e-a232-5dd01865ecfd-kube-api-access-n5k24" (OuterVolumeSpecName: "kube-api-access-n5k24") pod "1198c32e-6783-497e-a232-5dd01865ecfd" (UID: "1198c32e-6783-497e-a232-5dd01865ecfd"). InnerVolumeSpecName "kube-api-access-n5k24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.839876 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.840092 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-log" containerID="cri-o://bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.840248 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-metadata" containerID="cri-o://dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.859692 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" containerName="rabbitmq" containerID="cri-o://f7a8e765543e88a1c6e7d28463ec6d1148163252cc8cc4989b9a46a6cdfd7693" gracePeriod=604800 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.873747 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5k24\" (UniqueName: \"kubernetes.io/projected/1198c32e-6783-497e-a232-5dd01865ecfd-kube-api-access-n5k24\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.887221 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rnqcz"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.912493 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rnqcz"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.919001 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_ec9ebffe-8c04-481b-a187-bcdcca1a49a9/ovsdbserver-nb/0.log" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.919081 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.928314 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-86888b7b66-mgpdx"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.928847 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerName="barbican-keystone-listener-log" containerID="cri-o://3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.928872 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerName="barbican-keystone-listener" containerID="cri-o://22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.945849 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54977b5b64-bxjq6"] Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.948000 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54977b5b64-bxjq6" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api-log" containerID="cri-o://68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.948069 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54977b5b64-bxjq6" 
podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api" containerID="cri-o://9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: E0307 07:17:11.954589 4941 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:17:11 crc kubenswrapper[4941]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: if [ -n "nova_api" ]; then Mar 07 07:17:11 crc kubenswrapper[4941]: GRANT_DATABASE="nova_api" Mar 07 07:17:11 crc kubenswrapper[4941]: else Mar 07 07:17:11 crc kubenswrapper[4941]: GRANT_DATABASE="*" Mar 07 07:17:11 crc kubenswrapper[4941]: fi Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: # going for maximum compatibility here: Mar 07 07:17:11 crc kubenswrapper[4941]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 07 07:17:11 crc kubenswrapper[4941]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 07 07:17:11 crc kubenswrapper[4941]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 07 07:17:11 crc kubenswrapper[4941]: # support updates Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: $MYSQL_CMD < logger="UnhandledError" Mar 07 07:17:11 crc kubenswrapper[4941]: E0307 07:17:11.959323 4941 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:17:11 crc kubenswrapper[4941]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: if [ -n "cinder" ]; then Mar 07 07:17:11 crc kubenswrapper[4941]: GRANT_DATABASE="cinder" Mar 07 07:17:11 crc kubenswrapper[4941]: else Mar 07 07:17:11 crc kubenswrapper[4941]: GRANT_DATABASE="*" Mar 07 07:17:11 crc kubenswrapper[4941]: fi Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: # going for maximum compatibility here: Mar 07 07:17:11 crc kubenswrapper[4941]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 07 07:17:11 crc kubenswrapper[4941]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 07 07:17:11 crc kubenswrapper[4941]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 07 07:17:11 crc kubenswrapper[4941]: # support updates Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: $MYSQL_CMD < logger="UnhandledError" Mar 07 07:17:11 crc kubenswrapper[4941]: E0307 07:17:11.962265 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-427c-account-create-update-f84ls" podUID="6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d" Mar 07 07:17:11 crc kubenswrapper[4941]: E0307 07:17:11.971618 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-1815-account-create-update-8g7ft" podUID="79039479-0c9b-4931-8d9a-84271be3fee5" Mar 07 07:17:11 crc kubenswrapper[4941]: E0307 07:17:11.976818 4941 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 07 07:17:11 crc kubenswrapper[4941]: E0307 07:17:11.976880 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts podName:53e374be-8342-42ac-a82a-75854d38e098 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:13.976866468 +0000 UTC m=+1530.929231933 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts") pod "nova-cell1-5379-account-create-update-6728g" (UID: "53e374be-8342-42ac-a82a-75854d38e098") : configmap "openstack-cell1-scripts" not found Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.993157 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" containerName="galera" containerID="cri-o://4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c" gracePeriod=30 Mar 07 07:17:11 crc kubenswrapper[4941]: E0307 07:17:11.993871 4941 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:17:11 crc kubenswrapper[4941]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: if [ -n "neutron" ]; then Mar 07 07:17:11 crc kubenswrapper[4941]: GRANT_DATABASE="neutron" Mar 07 07:17:11 crc kubenswrapper[4941]: else Mar 07 07:17:11 crc kubenswrapper[4941]: GRANT_DATABASE="*" Mar 07 07:17:11 crc kubenswrapper[4941]: fi Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: # going for maximum compatibility here: Mar 07 07:17:11 crc kubenswrapper[4941]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 07 07:17:11 crc kubenswrapper[4941]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 07 07:17:11 crc kubenswrapper[4941]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 07 07:17:11 crc kubenswrapper[4941]: # support updates Mar 07 07:17:11 crc kubenswrapper[4941]: Mar 07 07:17:11 crc kubenswrapper[4941]: $MYSQL_CMD < logger="UnhandledError" Mar 07 07:17:11 crc kubenswrapper[4941]: E0307 07:17:11.994990 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-1a10-account-create-update-crrkr" podUID="6a1c3983-6c5e-48af-95cf-5f9536835f8d" Mar 07 07:17:11 crc kubenswrapper[4941]: I0307 07:17:11.996301 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.002774 4941 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:17:12 crc kubenswrapper[4941]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: if [ -n "nova_cell1" ]; 
then Mar 07 07:17:12 crc kubenswrapper[4941]: GRANT_DATABASE="nova_cell1" Mar 07 07:17:12 crc kubenswrapper[4941]: else Mar 07 07:17:12 crc kubenswrapper[4941]: GRANT_DATABASE="*" Mar 07 07:17:12 crc kubenswrapper[4941]: fi Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: # going for maximum compatibility here: Mar 07 07:17:12 crc kubenswrapper[4941]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 07 07:17:12 crc kubenswrapper[4941]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 07 07:17:12 crc kubenswrapper[4941]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 07 07:17:12 crc kubenswrapper[4941]: # support updates Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: $MYSQL_CMD < logger="UnhandledError" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.003345 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04152996-2000-4188-840c-1759d193c903" path="/var/lib/kubelet/pods/04152996-2000-4188-840c-1759d193c903/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.003993 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-5379-account-create-update-6728g" podUID="53e374be-8342-42ac-a82a-75854d38e098" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.004112 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0538f9-8d7c-40cf-bc98-a165a41d1bf6" path="/var/lib/kubelet/pods/0e0538f9-8d7c-40cf-bc98-a165a41d1bf6/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.004944 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a" path="/var/lib/kubelet/pods/1db1e7c6-8f4d-41ce-bb32-947ce9bfb24a/volumes" Mar 07 07:17:12 
crc kubenswrapper[4941]: I0307 07:17:12.005999 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25419741-acb3-497c-b0cf-c2bf78d58bd1" path="/var/lib/kubelet/pods/25419741-acb3-497c-b0cf-c2bf78d58bd1/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.011296 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a4dd8e-f8bf-4695-8883-da720a6e1efd" path="/var/lib/kubelet/pods/27a4dd8e-f8bf-4695-8883-da720a6e1efd/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.011873 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc484b5-13a7-48df-a417-3f04600f9320" path="/var/lib/kubelet/pods/2dc484b5-13a7-48df-a417-3f04600f9320/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.012420 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="471729e9-1d55-4a19-9fc7-2a5313410c46" path="/var/lib/kubelet/pods/471729e9-1d55-4a19-9fc7-2a5313410c46/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.018382 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39" path="/var/lib/kubelet/pods/4e5128d9-8fe6-4e85-8c0a-8f4a3a5a7b39/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.045946 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32aba6f1-c08f-4826-8492-9f2979275f5e" (UID: "32aba6f1-c08f-4826-8492-9f2979275f5e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.062145 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f6e27d-9523-4815-9efe-bf92df44ae37" path="/var/lib/kubelet/pods/53f6e27d-9523-4815-9efe-bf92df44ae37/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.063336 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef55d7b-622e-4660-bc2a-990353dae291" path="/var/lib/kubelet/pods/5ef55d7b-622e-4660-bc2a-990353dae291/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.064466 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="672d89d2-46b4-449f-ad71-2716d50eb2fe" path="/var/lib/kubelet/pods/672d89d2-46b4-449f-ad71-2716d50eb2fe/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.072140 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72be4758-3939-4551-89be-4927ddb81638" path="/var/lib/kubelet/pods/72be4758-3939-4551-89be-4927ddb81638/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.073090 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f249265-748f-4a9d-a23c-7bd70a62b669" path="/var/lib/kubelet/pods/8f249265-748f-4a9d-a23c-7bd70a62b669/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.073714 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930d2540-f85c-425f-8750-75ceb6d183b7" path="/var/lib/kubelet/pods/930d2540-f85c-425f-8750-75ceb6d183b7/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.075206 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da0c545-5faf-43e4-afbb-f016c457a9e0" path="/var/lib/kubelet/pods/9da0c545-5faf-43e4-afbb-f016c457a9e0/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.077315 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdb-rundir\") pod \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.077377 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-config\") pod \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.077420 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-combined-ca-bundle\") pod \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.077881 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxgdr\" (UniqueName: \"kubernetes.io/projected/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-kube-api-access-hxgdr\") pod \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.077994 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdbserver-nb-tls-certs\") pod \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.078032 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 
07:17:12.078057 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-scripts\") pod \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.078081 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-metrics-certs-tls-certs\") pod \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\" (UID: \"ec9ebffe-8c04-481b-a187-bcdcca1a49a9\") " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.078960 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.078976 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.079441 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2dc5023-3f28-4e34-be5f-bc3f59188e0b" path="/var/lib/kubelet/pods/a2dc5023-3f28-4e34-be5f-bc3f59188e0b/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.080063 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b030b241-21f3-48a4-88de-c63abeddccb1" path="/var/lib/kubelet/pods/b030b241-21f3-48a4-88de-c63abeddccb1/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.080788 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bd51e4-e60f-457c-b0da-2d08369daf3c" path="/var/lib/kubelet/pods/e7bd51e4-e60f-457c-b0da-2d08369daf3c/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.081895 4941 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f" path="/var/lib/kubelet/pods/e96bc9b1-d1eb-4f10-a9df-9bcbf43d897f/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.083015 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "ec9ebffe-8c04-481b-a187-bcdcca1a49a9" (UID: "ec9ebffe-8c04-481b-a187-bcdcca1a49a9"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.083656 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-config" (OuterVolumeSpecName: "config") pod "ec9ebffe-8c04-481b-a187-bcdcca1a49a9" (UID: "ec9ebffe-8c04-481b-a187-bcdcca1a49a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.092388 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-scripts" (OuterVolumeSpecName: "scripts") pod "ec9ebffe-8c04-481b-a187-bcdcca1a49a9" (UID: "ec9ebffe-8c04-481b-a187-bcdcca1a49a9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.096732 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f561ebac-b036-4d82-8e7a-2a43b031c0ba" path="/var/lib/kubelet/pods/f561ebac-b036-4d82-8e7a-2a43b031c0ba/volumes" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.097527 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "88ccdd50-0997-4e6e-9e05-3555379221a0" (UID: "88ccdd50-0997-4e6e-9e05-3555379221a0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.124565 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-kube-api-access-hxgdr" (OuterVolumeSpecName: "kube-api-access-hxgdr") pod "ec9ebffe-8c04-481b-a187-bcdcca1a49a9" (UID: "ec9ebffe-8c04-481b-a187-bcdcca1a49a9"). InnerVolumeSpecName "kube-api-access-hxgdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.128958 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "ec9ebffe-8c04-481b-a187-bcdcca1a49a9" (UID: "ec9ebffe-8c04-481b-a187-bcdcca1a49a9"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.129475 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "5f4f0d58-e159-427f-8cca-95525d4968cd" (UID: "5f4f0d58-e159-427f-8cca-95525d4968cd"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.150121 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1198c32e-6783-497e-a232-5dd01865ecfd" (UID: "1198c32e-6783-497e-a232-5dd01865ecfd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.158604 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data" (OuterVolumeSpecName: "config-data") pod "317acc48-d39a-4c99-8a4e-ef91b0fc3894" (UID: "317acc48-d39a-4c99-8a4e-ef91b0fc3894"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.171205 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384" exitCode=0 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.171240 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef" exitCode=0 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.173551 4941 generic.go:334] "Generic (PLEG): container finished" podID="99ca3e53-9ebc-464c-ac37-51163b9bc104" containerID="1a1116177ff225faade78191d5e77b34a374f65348259ab6e65f258ca8274368" exitCode=1 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.174079 4941 scope.go:117] "RemoveContainer" containerID="1a1116177ff225faade78191d5e77b34a374f65348259ab6e65f258ca8274368" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.177756 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.183222 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.183252 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.183266 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.183278 4941 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f4f0d58-e159-427f-8cca-95525d4968cd-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.183289 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxgdr\" (UniqueName: \"kubernetes.io/projected/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-kube-api-access-hxgdr\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.183300 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88ccdd50-0997-4e6e-9e05-3555379221a0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.183312 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 
07:17:12.183344 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.183357 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.201213 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1198c32e-6783-497e-a232-5dd01865ecfd" (UID: "1198c32e-6783-497e-a232-5dd01865ecfd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.212469 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.222428 4941 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:17:12 crc kubenswrapper[4941]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 07 07:17:12 crc 
kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: if [ -n "nova_api" ]; then Mar 07 07:17:12 crc kubenswrapper[4941]: GRANT_DATABASE="nova_api" Mar 07 07:17:12 crc kubenswrapper[4941]: else Mar 07 07:17:12 crc kubenswrapper[4941]: GRANT_DATABASE="*" Mar 07 07:17:12 crc kubenswrapper[4941]: fi Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: # going for maximum compatibility here: Mar 07 07:17:12 crc kubenswrapper[4941]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 07 07:17:12 crc kubenswrapper[4941]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 07 07:17:12 crc kubenswrapper[4941]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 07 07:17:12 crc kubenswrapper[4941]: # support updates Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: $MYSQL_CMD < logger="UnhandledError" Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.223618 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-427c-account-create-update-f84ls" podUID="6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.224973 4941 generic.go:334] "Generic (PLEG): container finished" podID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerID="b048a7900d858225d4830b842a8b3ed5f22a79a6146d7e6677222d62c98ef913" exitCode=143 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.232199 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.248618 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1198c32e-6783-497e-a232-5dd01865ecfd" (UID: "1198c32e-6783-497e-a232-5dd01865ecfd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.251796 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x7fq9" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.256507 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec9ebffe-8c04-481b-a187-bcdcca1a49a9" (UID: "ec9ebffe-8c04-481b-a187-bcdcca1a49a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.261392 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "317acc48-d39a-4c99-8a4e-ef91b0fc3894" (UID: "317acc48-d39a-4c99-8a4e-ef91b0fc3894"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.264310 4941 generic.go:334] "Generic (PLEG): container finished" podID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerID="68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3" exitCode=143 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.275812 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_32aba6f1-c08f-4826-8492-9f2979275f5e/ovsdbserver-sb/0.log" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.275927 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.281432 4941 generic.go:334] "Generic (PLEG): container finished" podID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerID="e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f" exitCode=143 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.285076 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.285102 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.285113 4941 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.285121 4941 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-dns-svc\") on node \"crc\" 
DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.285128 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317acc48-d39a-4c99-8a4e-ef91b0fc3894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.288001 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-config" (OuterVolumeSpecName: "config") pod "1198c32e-6783-497e-a232-5dd01865ecfd" (UID: "1198c32e-6783-497e-a232-5dd01865ecfd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.289295 4941 generic.go:334] "Generic (PLEG): container finished" podID="c892cbf7-126c-4638-854d-18cef63c7747" containerID="bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d" exitCode=143 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.294626 4941 generic.go:334] "Generic (PLEG): container finished" podID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerID="3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3" exitCode=143 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.298513 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ec9ebffe-8c04-481b-a187-bcdcca1a49a9/ovsdbserver-nb/0.log" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.298954 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.304328 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "32aba6f1-c08f-4826-8492-9f2979275f5e" (UID: "32aba6f1-c08f-4826-8492-9f2979275f5e"). 
InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.308668 4941 generic.go:334] "Generic (PLEG): container finished" podID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerID="f8fe04edd08619bd4118ac72769de11cb1dc8824ae2dfa0799d60d9d7cab0731" exitCode=143 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.322243 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ec9ebffe-8c04-481b-a187-bcdcca1a49a9" (UID: "ec9ebffe-8c04-481b-a187-bcdcca1a49a9"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.323792 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8fkwt_88ccdd50-0997-4e6e-9e05-3555379221a0/openstack-network-exporter/0.log" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.323952 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8fkwt" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.346202 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "32aba6f1-c08f-4826-8492-9f2979275f5e" (UID: "32aba6f1-c08f-4826-8492-9f2979275f5e"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.354662 4941 generic.go:334] "Generic (PLEG): container finished" podID="e27683db-592f-485a-93b3-93273e1644c3" containerID="5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02" exitCode=143 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.363097 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1198c32e-6783-497e-a232-5dd01865ecfd" (UID: "1198c32e-6783-497e-a232-5dd01865ecfd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.386608 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "ec9ebffe-8c04-481b-a187-bcdcca1a49a9" (UID: "ec9ebffe-8c04-481b-a187-bcdcca1a49a9"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.388138 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.388164 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32aba6f1-c08f-4826-8492-9f2979275f5e-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.388177 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.388191 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9ebffe-8c04-481b-a187-bcdcca1a49a9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.388203 4941 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.388215 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1198c32e-6783-497e-a232-5dd01865ecfd-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.389969 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390013 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390026 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-427c-account-create-update-f84ls"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390046 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390058 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jjz5s" event={"ID":"99ca3e53-9ebc-464c-ac37-51163b9bc104","Type":"ContainerDied","Data":"1a1116177ff225faade78191d5e77b34a374f65348259ab6e65f258ca8274368"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390069 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jjz5s" event={"ID":"99ca3e53-9ebc-464c-ac37-51163b9bc104","Type":"ContainerStarted","Data":"6a52803a97ef3a4bbc312cab4b76fa6a183fd74f73e38061dfe093201b1abb7f"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390078 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jjz5s"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390100 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1815-account-create-update-8g7ft" event={"ID":"79039479-0c9b-4931-8d9a-84271be3fee5","Type":"ContainerStarted","Data":"dba910feddd0370ce02a6f498504191c2d3eb4b0bc9b581bbc89885d03327a9a"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390113 4941 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390127 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5379-account-create-update-6728g" event={"ID":"53e374be-8342-42ac-a82a-75854d38e098","Type":"ContainerStarted","Data":"b97c7c8120c5aa2e7308bd3ec643f9e3b2594fa3d63d170685e85283a383c0c7"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390138 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-427c-account-create-update-f84ls" event={"ID":"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d","Type":"ContainerStarted","Data":"239582e2daad26e1c830b6245399fd7d7458f9e3278085ea38b139c297ff1695"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390147 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7","Type":"ContainerDied","Data":"b048a7900d858225d4830b842a8b3ed5f22a79a6146d7e6677222d62c98ef913"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390157 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n4cfj"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390168 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"317acc48-d39a-4c99-8a4e-ef91b0fc3894","Type":"ContainerDied","Data":"9d00ca63bc8b899319b7b54870f0e72ce188d9a890bf5cc3b4845079d5e44aa7"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390182 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x7fq9" event={"ID":"5f4f0d58-e159-427f-8cca-95525d4968cd","Type":"ContainerDied","Data":"b5bcf5103d089cc5d3fe178719c8c36baaced33e41f54fc4d1efd0a62b22182b"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390194 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n4cfj"] Mar 07 
07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390207 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54977b5b64-bxjq6" event={"ID":"757b037d-b7b8-4690-93b9-ec85c5bf82db","Type":"ContainerDied","Data":"68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390217 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8lhkg"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390231 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a10-account-create-update-crrkr" event={"ID":"6a1c3983-6c5e-48af-95cf-5f9536835f8d","Type":"ContainerStarted","Data":"e80a4bca9b8e2124ab4d22aad7f26cc4d0f7ef6cc33b70bb2317e85a4f8122d9"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390240 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390253 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8lhkg"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390264 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"32aba6f1-c08f-4826-8492-9f2979275f5e","Type":"ContainerDied","Data":"fcafe69f9d33cdcdd767b891dba420467db99d4b7347ab7c0732bdb8bd09d45a"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390282 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456","Type":"ContainerDied","Data":"e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390294 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1815-account-create-update-8g7ft"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390305 4941 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-api-427c-account-create-update-f84ls"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390316 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c892cbf7-126c-4638-854d-18cef63c7747","Type":"ContainerDied","Data":"bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390328 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" event={"ID":"ea4583b7-29d7-466d-8c3d-ad9981ebc66d","Type":"ContainerDied","Data":"3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390339 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1a10-account-create-update-crrkr"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390347 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-6728g"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390356 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ec9ebffe-8c04-481b-a187-bcdcca1a49a9","Type":"ContainerDied","Data":"36f36c5e43282180ca4da99230a4eda143f25603a3890cbfb7db330759e9942b"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390370 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-685ff95674-ldzd4" event={"ID":"ad2b6a75-839f-4fec-9f12-fb520b44c7ce","Type":"ContainerDied","Data":"f8fe04edd08619bd4118ac72769de11cb1dc8824ae2dfa0799d60d9d7cab0731"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390380 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5598777fd7-9fgcl"] Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390391 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8fkwt" 
event={"ID":"88ccdd50-0997-4e6e-9e05-3555379221a0","Type":"ContainerDied","Data":"10439a9599cfed15257dfcbf32bc6d497a5c9f11ef5a7d790c825d44d56f77ba"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390427 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57855ff457-mshjt" event={"ID":"e27683db-592f-485a-93b3-93273e1644c3","Type":"ContainerDied","Data":"5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390623 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5598777fd7-9fgcl" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerName="proxy-httpd" containerID="cri-o://2c0681e10d442ad2f1f0fc4b809613e3cdacd8d007ff20ae0809c479b09094cb" gracePeriod=30 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390812 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="056debcc-d271-4ea1-a70c-fc67794f060e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d6c2f62c9103f19083c550098257ae768e69623eab5370a23a6b39d03261c98b" gracePeriod=30 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390875 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5598777fd7-9fgcl" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerName="proxy-server" containerID="cri-o://cfcc8787654437566381b6642741214a6b8bd5d86a6e869ab2659ad125818269" gracePeriod=30 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.390884 4941 scope.go:117] "RemoveContainer" containerID="750b1358dd25ebdb02604816fc6ce9f0509325964bef5a9bdfbf090a38266760" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.391154 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="943d63f3-758d-4884-8086-93defd44f58a" 
containerName="nova-cell1-conductor-conductor" containerID="cri-o://272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" gracePeriod=30 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.391348 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" containerName="nova-cell0-conductor-conductor" containerID="cri-o://d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" gracePeriod=30 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.392964 4941 generic.go:334] "Generic (PLEG): container finished" podID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerID="02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e" exitCode=0 Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.393049 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c6df5b777-qhsgz" event={"ID":"d3cb3645-4e27-450f-a712-f656dfa9e8e1","Type":"ContainerDied","Data":"02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.413345 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" event={"ID":"1198c32e-6783-497e-a232-5dd01865ecfd","Type":"ContainerDied","Data":"167e36efc1cb87a7e1b8c0a627ff3357c9836da8620dd3d881931aab794b18b0"} Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.413536 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dd56c4d5-xmpfq" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.473635 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mzq6w"] Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.549507 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.549595 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data podName:3963d293-d9e9-44b6-b0a5-b1532b4a0a31 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:16.549576815 +0000 UTC m=+1533.501942280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data") pod "rabbitmq-server-0" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31") : configmap "rabbitmq-config-data" not found Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.589662 4941 scope.go:117] "RemoveContainer" containerID="4e90b5427d15f1e301d7820993316b94e70b1e5e57e33af40b7531f4506658b7" Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.597822 4941 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:17:12 crc kubenswrapper[4941]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 07 07:17:12 crc kubenswrapper[4941]: 
Mar 07 07:17:12 crc kubenswrapper[4941]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: if [ -n "nova_cell0" ]; then Mar 07 07:17:12 crc kubenswrapper[4941]: GRANT_DATABASE="nova_cell0" Mar 07 07:17:12 crc kubenswrapper[4941]: else Mar 07 07:17:12 crc kubenswrapper[4941]: GRANT_DATABASE="*" Mar 07 07:17:12 crc kubenswrapper[4941]: fi Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: # going for maximum compatibility here: Mar 07 07:17:12 crc kubenswrapper[4941]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 07 07:17:12 crc kubenswrapper[4941]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 07 07:17:12 crc kubenswrapper[4941]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 07 07:17:12 crc kubenswrapper[4941]: # support updates Mar 07 07:17:12 crc kubenswrapper[4941]: Mar 07 07:17:12 crc kubenswrapper[4941]: $MYSQL_CMD < logger="UnhandledError" Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.604494 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" podUID="b055e9de-2e86-467d-9e93-8fd06977cc87" Mar 07 07:17:12 crc kubenswrapper[4941]: I0307 07:17:12.895249 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="056debcc-d271-4ea1-a70c-fc67794f060e" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.201:6080/vnc_lite.html\": dial tcp 10.217.0.201:6080: connect: connection refused" Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.934363 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.936681 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.938898 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.938952 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="943d63f3-758d-4884-8086-93defd44f58a" containerName="nova-cell1-conductor-conductor" Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.957241 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:12 crc kubenswrapper[4941]: E0307 07:17:12.957323 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data podName:aeb1dd04-5b8c-49b4-bf65-be38fb8ae670 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:16.957304575 +0000 UTC m=+1533.909670040 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data") pod "rabbitmq-cell1-server-0" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670") : configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.286756 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.290029 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x7fq9"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.296039 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-x7fq9"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.317760 4941 scope.go:117] "RemoveContainer" containerID="1a0ea8a8f3c822cadf12d5e4208a3d4ccded7ff8311a82b01edce4e26bce47c2" Mar 07 07:17:13 crc kubenswrapper[4941]: E0307 07:17:13.318019 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd is running failed: container process not found" containerID="ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:17:13 crc kubenswrapper[4941]: E0307 07:17:13.318771 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd is running failed: container process not found" containerID="ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:17:13 crc kubenswrapper[4941]: E0307 07:17:13.320030 4941 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd is running failed: container process not found" containerID="ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 07:17:13 crc kubenswrapper[4941]: E0307 07:17:13.320057 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="05a91fa3-14f1-4d15-bdfc-bb1fc310a913" containerName="nova-scheduler-scheduler" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.334146 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.350989 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-xmpfq"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.362595 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-xmpfq"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.369448 4941 scope.go:117] "RemoveContainer" containerID="4d3c4f1cb725db64e33869898d4504671134f70b853278179133c55e028ed533" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.369593 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.372956 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhxsf\" (UniqueName: \"kubernetes.io/projected/79039479-0c9b-4931-8d9a-84271be3fee5-kube-api-access-mhxsf\") pod \"79039479-0c9b-4931-8d9a-84271be3fee5\" (UID: \"79039479-0c9b-4931-8d9a-84271be3fee5\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.373015 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts\") pod \"53e374be-8342-42ac-a82a-75854d38e098\" (UID: \"53e374be-8342-42ac-a82a-75854d38e098\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.373045 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prrs5\" (UniqueName: \"kubernetes.io/projected/53e374be-8342-42ac-a82a-75854d38e098-kube-api-access-prrs5\") pod \"53e374be-8342-42ac-a82a-75854d38e098\" (UID: \"53e374be-8342-42ac-a82a-75854d38e098\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.373087 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79039479-0c9b-4931-8d9a-84271be3fee5-operator-scripts\") pod \"79039479-0c9b-4931-8d9a-84271be3fee5\" (UID: \"79039479-0c9b-4931-8d9a-84271be3fee5\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.374382 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53e374be-8342-42ac-a82a-75854d38e098" (UID: "53e374be-8342-42ac-a82a-75854d38e098"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.375148 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79039479-0c9b-4931-8d9a-84271be3fee5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79039479-0c9b-4931-8d9a-84271be3fee5" (UID: "79039479-0c9b-4931-8d9a-84271be3fee5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.392416 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.392597 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79039479-0c9b-4931-8d9a-84271be3fee5-kube-api-access-mhxsf" (OuterVolumeSpecName: "kube-api-access-mhxsf") pod "79039479-0c9b-4931-8d9a-84271be3fee5" (UID: "79039479-0c9b-4931-8d9a-84271be3fee5"). InnerVolumeSpecName "kube-api-access-mhxsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.397745 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.406951 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e374be-8342-42ac-a82a-75854d38e098-kube-api-access-prrs5" (OuterVolumeSpecName: "kube-api-access-prrs5") pod "53e374be-8342-42ac-a82a-75854d38e098" (UID: "53e374be-8342-42ac-a82a-75854d38e098"). InnerVolumeSpecName "kube-api-access-prrs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.411552 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.428387 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8fkwt"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.440552 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-8fkwt"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.476033 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.476774 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1c3983-6c5e-48af-95cf-5f9536835f8d-operator-scripts\") pod \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\" (UID: \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.476805 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqpj9\" (UniqueName: \"kubernetes.io/projected/6a1c3983-6c5e-48af-95cf-5f9536835f8d-kube-api-access-tqpj9\") pod \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\" (UID: \"6a1c3983-6c5e-48af-95cf-5f9536835f8d\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.476849 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-config-data\") pod \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.476870 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-combined-ca-bundle\") pod \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.476952 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwmdm\" (UniqueName: \"kubernetes.io/projected/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-kube-api-access-hwmdm\") pod \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\" (UID: \"05a91fa3-14f1-4d15-bdfc-bb1fc310a913\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.477294 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhxsf\" (UniqueName: \"kubernetes.io/projected/79039479-0c9b-4931-8d9a-84271be3fee5-kube-api-access-mhxsf\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.477305 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53e374be-8342-42ac-a82a-75854d38e098-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.477314 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prrs5\" (UniqueName: \"kubernetes.io/projected/53e374be-8342-42ac-a82a-75854d38e098-kube-api-access-prrs5\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.477347 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79039479-0c9b-4931-8d9a-84271be3fee5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.479367 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a1c3983-6c5e-48af-95cf-5f9536835f8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a1c3983-6c5e-48af-95cf-5f9536835f8d" (UID: 
"6a1c3983-6c5e-48af-95cf-5f9536835f8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: E0307 07:17:13.481014 4941 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ccdd50_0997_4e6e_9e05_3555379221a0.slice/crio-10439a9599cfed15257dfcbf32bc6d497a5c9f11ef5a7d790c825d44d56f77ba\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32aba6f1_c08f_4826_8492_9f2979275f5e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1198c32e_6783_497e_a232_5dd01865ecfd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ccdd50_0997_4e6e_9e05_3555379221a0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1198c32e_6783_497e_a232_5dd01865ecfd.slice/crio-167e36efc1cb87a7e1b8c0a627ff3357c9836da8620dd3d881931aab794b18b0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec9ebffe_8c04_481b_a187_bcdcca1a49a9.slice\": RecentStats: unable to find data in memory cache]" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.498877 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.514164 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"056debcc-d271-4ea1-a70c-fc67794f060e","Type":"ContainerDied","Data":"d6c2f62c9103f19083c550098257ae768e69623eab5370a23a6b39d03261c98b"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.514210 4941 generic.go:334] "Generic 
(PLEG): container finished" podID="056debcc-d271-4ea1-a70c-fc67794f060e" containerID="d6c2f62c9103f19083c550098257ae768e69623eab5370a23a6b39d03261c98b" exitCode=0 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.536574 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-kube-api-access-hwmdm" (OuterVolumeSpecName: "kube-api-access-hwmdm") pod "05a91fa3-14f1-4d15-bdfc-bb1fc310a913" (UID: "05a91fa3-14f1-4d15-bdfc-bb1fc310a913"). InnerVolumeSpecName "kube-api-access-hwmdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.555664 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1c3983-6c5e-48af-95cf-5f9536835f8d-kube-api-access-tqpj9" (OuterVolumeSpecName: "kube-api-access-tqpj9") pod "6a1c3983-6c5e-48af-95cf-5f9536835f8d" (UID: "6a1c3983-6c5e-48af-95cf-5f9536835f8d"). InnerVolumeSpecName "kube-api-access-tqpj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.557834 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a10-account-create-update-crrkr" event={"ID":"6a1c3983-6c5e-48af-95cf-5f9536835f8d","Type":"ContainerDied","Data":"e80a4bca9b8e2124ab4d22aad7f26cc4d0f7ef6cc33b70bb2317e85a4f8122d9"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.557929 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1a10-account-create-update-crrkr" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.561055 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" event={"ID":"b055e9de-2e86-467d-9e93-8fd06977cc87","Type":"ContainerStarted","Data":"b236cb6a0d3cc1f45fcd598a7da06b911fb80182bb1b2d8dc745b6304358f056"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.563202 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1815-account-create-update-8g7ft" event={"ID":"79039479-0c9b-4931-8d9a-84271be3fee5","Type":"ContainerDied","Data":"dba910feddd0370ce02a6f498504191c2d3eb4b0bc9b581bbc89885d03327a9a"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.563258 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1815-account-create-update-8g7ft" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.584844 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwmdm\" (UniqueName: \"kubernetes.io/projected/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-kube-api-access-hwmdm\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.584867 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1c3983-6c5e-48af-95cf-5f9536835f8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.584876 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqpj9\" (UniqueName: \"kubernetes.io/projected/6a1c3983-6c5e-48af-95cf-5f9536835f8d-kube-api-access-tqpj9\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.593025 4941 generic.go:334] "Generic (PLEG): container finished" podID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" 
containerID="4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c" exitCode=0 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.593095 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7b306e38-c479-45ff-93ab-ca0e0e6a3aef","Type":"ContainerDied","Data":"4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.595891 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5379-account-create-update-6728g" event={"ID":"53e374be-8342-42ac-a82a-75854d38e098","Type":"ContainerDied","Data":"b97c7c8120c5aa2e7308bd3ec643f9e3b2594fa3d63d170685e85283a383c0c7"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.596040 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.600921 4941 generic.go:334] "Generic (PLEG): container finished" podID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerID="cfcc8787654437566381b6642741214a6b8bd5d86a6e869ab2659ad125818269" exitCode=0 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.600946 4941 generic.go:334] "Generic (PLEG): container finished" podID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerID="2c0681e10d442ad2f1f0fc4b809613e3cdacd8d007ff20ae0809c479b09094cb" exitCode=0 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.601015 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598777fd7-9fgcl" event={"ID":"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510","Type":"ContainerDied","Data":"cfcc8787654437566381b6642741214a6b8bd5d86a6e869ab2659ad125818269"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.601060 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598777fd7-9fgcl" 
event={"ID":"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510","Type":"ContainerDied","Data":"2c0681e10d442ad2f1f0fc4b809613e3cdacd8d007ff20ae0809c479b09094cb"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.601973 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05a91fa3-14f1-4d15-bdfc-bb1fc310a913" (UID: "05a91fa3-14f1-4d15-bdfc-bb1fc310a913"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.604440 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.604589 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-config-data" (OuterVolumeSpecName: "config-data") pod "05a91fa3-14f1-4d15-bdfc-bb1fc310a913" (UID: "05a91fa3-14f1-4d15-bdfc-bb1fc310a913"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.617447 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.170:8776/healthcheck\": read tcp 10.217.0.2:48420->10.217.0.170:8776: read: connection reset by peer" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.617769 4941 scope.go:117] "RemoveContainer" containerID="067c949e419323cc676040b4fd78ae141b059399cfde952987dfd044a128e4f0" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.618217 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.619368 4941 generic.go:334] "Generic (PLEG): container finished" podID="05a91fa3-14f1-4d15-bdfc-bb1fc310a913" containerID="ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd" exitCode=0 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.619550 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.626522 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05a91fa3-14f1-4d15-bdfc-bb1fc310a913","Type":"ContainerDied","Data":"ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.626594 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.626623 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05a91fa3-14f1-4d15-bdfc-bb1fc310a913","Type":"ContainerDied","Data":"1fcf3ac7428cb9807e710cd2d12bcc87bfcc402eadda52d1e20b5646b848c350"} Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.691557 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.703909 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a91fa3-14f1-4d15-bdfc-bb1fc310a913-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.709678 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.709948 4941 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="ceilometer-central-agent" containerID="cri-o://6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494" gracePeriod=30 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.710077 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="proxy-httpd" containerID="cri-o://7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d" gracePeriod=30 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.710112 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="sg-core" containerID="cri-o://340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f" gracePeriod=30 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.710142 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="ceilometer-notification-agent" containerID="cri-o://24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a" gracePeriod=30 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.746485 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.746684 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="74c4b049-d672-41e8-b3cb-09b800f04a19" containerName="kube-state-metrics" containerID="cri-o://d0ced7486dfed1220f94f3b911f9652e0c4769872e3285a1108e1beb5bef597b" gracePeriod=30 Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.807023 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-config-data\") pod \"056debcc-d271-4ea1-a70c-fc67794f060e\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.807061 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-vencrypt-tls-certs\") pod \"056debcc-d271-4ea1-a70c-fc67794f060e\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.807262 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-786b9\" (UniqueName: \"kubernetes.io/projected/056debcc-d271-4ea1-a70c-fc67794f060e-kube-api-access-786b9\") pod \"056debcc-d271-4ea1-a70c-fc67794f060e\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.807284 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-nova-novncproxy-tls-certs\") pod \"056debcc-d271-4ea1-a70c-fc67794f060e\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " Mar 07 07:17:13 crc kubenswrapper[4941]: I0307 07:17:13.807313 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-combined-ca-bundle\") pod \"056debcc-d271-4ea1-a70c-fc67794f060e\" (UID: \"056debcc-d271-4ea1-a70c-fc67794f060e\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.845735 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/056debcc-d271-4ea1-a70c-fc67794f060e-kube-api-access-786b9" (OuterVolumeSpecName: "kube-api-access-786b9") pod "056debcc-d271-4ea1-a70c-fc67794f060e" (UID: 
"056debcc-d271-4ea1-a70c-fc67794f060e"). InnerVolumeSpecName "kube-api-access-786b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.863555 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-582b-account-create-update-c9h8k"] Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.890174 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7c6df5b777-qhsgz" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": dial tcp 10.217.0.160:9696: connect: connection refused" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.890588 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-config-data" (OuterVolumeSpecName: "config-data") pod "056debcc-d271-4ea1-a70c-fc67794f060e" (UID: "056debcc-d271-4ea1-a70c-fc67794f060e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.922819 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-786b9\" (UniqueName: \"kubernetes.io/projected/056debcc-d271-4ea1-a70c-fc67794f060e-kube-api-access-786b9\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.922854 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:13.924041 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.930303 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.930326 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "056debcc-d271-4ea1-a70c-fc67794f060e" (UID: "056debcc-d271-4ea1-a70c-fc67794f060e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.930604 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="5b22d449-d1ec-4bf4-a876-b86a87508580" containerName="memcached" containerID="cri-o://db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6" gracePeriod=30 Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:13.935116 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:13.956357 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:13.956445 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" containerName="nova-cell0-conductor-conductor" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.961480 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "056debcc-d271-4ea1-a70c-fc67794f060e" (UID: "056debcc-d271-4ea1-a70c-fc67794f060e"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:13.986802 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "056debcc-d271-4ea1-a70c-fc67794f060e" (UID: "056debcc-d271-4ea1-a70c-fc67794f060e"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.031842 4941 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.031878 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.031891 4941 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/056debcc-d271-4ea1-a70c-fc67794f060e-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.146681 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c is running failed: container process not found" containerID="4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.147415 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of 4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c is running failed: container process not found" containerID="4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.147644 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c is running failed: container process not found" containerID="4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.147687 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" containerName="galera" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.157915 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1198c32e-6783-497e-a232-5dd01865ecfd" path="/var/lib/kubelet/pods/1198c32e-6783-497e-a232-5dd01865ecfd/volumes" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.158911 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d028683-343b-490d-9790-202e64e4e721" path="/var/lib/kubelet/pods/2d028683-343b-490d-9790-202e64e4e721/volumes" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.159524 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" path="/var/lib/kubelet/pods/317acc48-d39a-4c99-8a4e-ef91b0fc3894/volumes" Mar 07 07:17:14 crc 
kubenswrapper[4941]: I0307 07:17:14.160210 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" path="/var/lib/kubelet/pods/32aba6f1-c08f-4826-8492-9f2979275f5e/volumes" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.161270 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" path="/var/lib/kubelet/pods/5f4f0d58-e159-427f-8cca-95525d4968cd/volumes" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.161947 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ceb33a6-9365-45ba-99a7-db9a11b3e7ca" path="/var/lib/kubelet/pods/7ceb33a6-9365-45ba-99a7-db9a11b3e7ca/volumes" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.162581 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ccdd50-0997-4e6e-9e05-3555379221a0" path="/var/lib/kubelet/pods/88ccdd50-0997-4e6e-9e05-3555379221a0/volumes" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.163806 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" path="/var/lib/kubelet/pods/ec9ebffe-8c04-481b-a187-bcdcca1a49a9/volumes" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.165090 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-582b-account-create-update-c9h8k"] Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.165126 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-582b-account-create-update-j28r7"] Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.165841 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a91fa3-14f1-4d15-bdfc-bb1fc310a913" containerName="nova-scheduler-scheduler" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.165864 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a91fa3-14f1-4d15-bdfc-bb1fc310a913" containerName="nova-scheduler-scheduler" Mar 
07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.165884 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ccdd50-0997-4e6e-9e05-3555379221a0" containerName="openstack-network-exporter" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.165893 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ccdd50-0997-4e6e-9e05-3555379221a0" containerName="openstack-network-exporter" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.165912 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="ovsdbserver-nb" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.165922 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="ovsdbserver-nb" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.165932 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerName="probe" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.165941 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerName="probe" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.165949 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="056debcc-d271-4ea1-a70c-fc67794f060e" containerName="nova-cell1-novncproxy-novncproxy" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.165957 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="056debcc-d271-4ea1-a70c-fc67794f060e" containerName="nova-cell1-novncproxy-novncproxy" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.165973 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" containerName="ovn-controller" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.165980 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" 
containerName="ovn-controller" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.166003 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerName="openstack-network-exporter" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166011 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerName="openstack-network-exporter" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.166024 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerName="cinder-scheduler" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166031 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerName="cinder-scheduler" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.166047 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="openstack-network-exporter" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166055 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="openstack-network-exporter" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.166069 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1198c32e-6783-497e-a232-5dd01865ecfd" containerName="init" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166076 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1198c32e-6783-497e-a232-5dd01865ecfd" containerName="init" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.166090 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1198c32e-6783-497e-a232-5dd01865ecfd" containerName="dnsmasq-dns" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166098 4941 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1198c32e-6783-497e-a232-5dd01865ecfd" containerName="dnsmasq-dns" Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.166112 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerName="ovsdbserver-sb" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166119 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerName="ovsdbserver-sb" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166314 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a91fa3-14f1-4d15-bdfc-bb1fc310a913" containerName="nova-scheduler-scheduler" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166330 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f4f0d58-e159-427f-8cca-95525d4968cd" containerName="ovn-controller" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166342 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerName="ovsdbserver-sb" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166363 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="ovsdbserver-nb" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166371 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ccdd50-0997-4e6e-9e05-3555379221a0" containerName="openstack-network-exporter" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166382 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerName="cinder-scheduler" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166390 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="317acc48-d39a-4c99-8a4e-ef91b0fc3894" containerName="probe" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166423 4941 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="056debcc-d271-4ea1-a70c-fc67794f060e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166439 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1198c32e-6783-497e-a232-5dd01865ecfd" containerName="dnsmasq-dns"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166450 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="32aba6f1-c08f-4826-8492-9f2979275f5e" containerName="openstack-network-exporter"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.166461 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9ebffe-8c04-481b-a187-bcdcca1a49a9" containerName="openstack-network-exporter"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167123 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-582b-account-create-update-j28r7"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167145 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-lbdr9"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167158 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-h62cn"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167171 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-h62cn"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167184 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5c79966c-pthdw"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167216 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-lbdr9"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167232 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167244 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-582b-account-create-update-j28r7"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167255 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-5ft6f"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167266 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-5ft6f"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167277 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jjz5s"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167706 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-582b-account-create-update-j28r7"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.167836 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5c79966c-pthdw" podUID="b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" containerName="keystone-api" containerID="cri-o://b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1" gracePeriod=30
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.455924 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" containerName="galera" containerID="cri-o://7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3" gracePeriod=30
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.653029 4941 generic.go:334] "Generic (PLEG): container finished" podID="74c4b049-d672-41e8-b3cb-09b800f04a19" containerID="d0ced7486dfed1220f94f3b911f9652e0c4769872e3285a1108e1beb5bef597b" exitCode=2
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.653118 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"74c4b049-d672-41e8-b3cb-09b800f04a19","Type":"ContainerDied","Data":"d0ced7486dfed1220f94f3b911f9652e0c4769872e3285a1108e1beb5bef597b"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.669534 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.669949 4941 generic.go:334] "Generic (PLEG): container finished" podID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerID="fb6516769d261d733fc9be130e2e6aea292a7c5ff2e94299bbabd569bbe859a7" exitCode=0
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.670080 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-685ff95674-ldzd4" event={"ID":"ad2b6a75-839f-4fec-9f12-fb520b44c7ce","Type":"ContainerDied","Data":"fb6516769d261d733fc9be130e2e6aea292a7c5ff2e94299bbabd569bbe859a7"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.692499 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" event={"ID":"b055e9de-2e86-467d-9e93-8fd06977cc87","Type":"ContainerDied","Data":"b236cb6a0d3cc1f45fcd598a7da06b911fb80182bb1b2d8dc745b6304358f056"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.692539 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b236cb6a0d3cc1f45fcd598a7da06b911fb80182bb1b2d8dc745b6304358f056"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.696314 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5598777fd7-9fgcl"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.696396 4941 generic.go:334] "Generic (PLEG): container finished" podID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerID="885046a8a3875aba6340cf20fd6d156d9145acc04deb00a3dc3da2bb74d33aa4" exitCode=0
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.696475 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7","Type":"ContainerDied","Data":"885046a8a3875aba6340cf20fd6d156d9145acc04deb00a3dc3da2bb74d33aa4"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.699420 4941 generic.go:334] "Generic (PLEG): container finished" podID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerID="7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d" exitCode=0
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.699442 4941 generic.go:334] "Generic (PLEG): container finished" podID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerID="340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f" exitCode=2
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.699450 4941 generic.go:334] "Generic (PLEG): container finished" podID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerID="6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494" exitCode=0
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.699484 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerDied","Data":"7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.699509 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerDied","Data":"340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.699520 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerDied","Data":"6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.700740 4941 scope.go:117] "RemoveContainer" containerID="be456d1e5f8553ef85166e01f835ac78767170cdd6c5ad60cdfc7756602760a6"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.706719 4941 generic.go:334] "Generic (PLEG): container finished" podID="99ca3e53-9ebc-464c-ac37-51163b9bc104" containerID="0b519bc4136cc105e727716b8acfe9c39c1115c2fb1a9245e1ee243a155acf47" exitCode=1
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.706789 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jjz5s" event={"ID":"99ca3e53-9ebc-464c-ac37-51163b9bc104","Type":"ContainerDied","Data":"0b519bc4136cc105e727716b8acfe9c39c1115c2fb1a9245e1ee243a155acf47"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.707300 4941 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-jjz5s" secret="" err="secret \"galera-openstack-dockercfg-d4fqc\" not found"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.707332 4941 scope.go:117] "RemoveContainer" containerID="0b519bc4136cc105e727716b8acfe9c39c1115c2fb1a9245e1ee243a155acf47"
Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.707768 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-jjz5s_openstack(99ca3e53-9ebc-464c-ac37-51163b9bc104)\"" pod="openstack/root-account-create-update-jjz5s" podUID="99ca3e53-9ebc-464c-ac37-51163b9bc104"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.712014 4941 generic.go:334] "Generic (PLEG): container finished" podID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerID="7309868f4caab95c79325c4137c9791aaa3b778c28a0d6e39b6d6ff175e4b90e" exitCode=0
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.712093 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"753c78f9-47e6-4098-91fa-9adac0997ba4","Type":"ContainerDied","Data":"7309868f4caab95c79325c4137c9791aaa3b778c28a0d6e39b6d6ff175e4b90e"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.712126 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"753c78f9-47e6-4098-91fa-9adac0997ba4","Type":"ContainerDied","Data":"ddb6dc5958da3cee1b1aef7e9e6302f3ced067cfdeaa1fa3a76fe919de368e89"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.712217 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddb6dc5958da3cee1b1aef7e9e6302f3ced067cfdeaa1fa3a76fe919de368e89"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.713894 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"056debcc-d271-4ea1-a70c-fc67794f060e","Type":"ContainerDied","Data":"5bdda605773738a0c60ba4bd0138471e63090a8ed2166ba138afc3d68b2ba191"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.713950 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.715723 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7b306e38-c479-45ff-93ab-ca0e0e6a3aef","Type":"ContainerDied","Data":"fc62b7217c793fd339905be7b9764f09d0143de2cf0406cb41b1edbc4b3a0fb6"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.715791 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.719661 4941 generic.go:334] "Generic (PLEG): container finished" podID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerID="ed43789861becd87eee81a4232f20de6afb6f8198fc9dd762f6924dee8e81bc0" exitCode=0
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.720628 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6d72c12-422e-48fd-b56b-8344260e3e01","Type":"ContainerDied","Data":"ed43789861becd87eee81a4232f20de6afb6f8198fc9dd762f6924dee8e81bc0"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.720710 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e6d72c12-422e-48fd-b56b-8344260e3e01","Type":"ContainerDied","Data":"98678e2d19eceae08efb60bdac32c0da6191c50324dd4cc14b7dbf4a21896eda"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.720722 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98678e2d19eceae08efb60bdac32c0da6191c50324dd4cc14b7dbf4a21896eda"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.724756 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-427c-account-create-update-f84ls" event={"ID":"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d","Type":"ContainerDied","Data":"239582e2daad26e1c830b6245399fd7d7458f9e3278085ea38b139c297ff1695"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.724796 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239582e2daad26e1c830b6245399fd7d7458f9e3278085ea38b139c297ff1695"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.736834 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598777fd7-9fgcl" event={"ID":"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510","Type":"ContainerDied","Data":"bf075523b0769cc58db4e1637f1845844c69fa2ab2dd6316cb6667f9da692f93"}
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.736935 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5598777fd7-9fgcl"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751356 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-combined-ca-bundle\") pod \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751383 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-galera-tls-certs\") pod \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751415 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-config-data\") pod \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751440 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-etc-swift\") pod \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751469 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-default\") pod \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751505 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751536 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxv7p\" (UniqueName: \"kubernetes.io/projected/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kube-api-access-rxv7p\") pod \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751553 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kolla-config\") pod \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751577 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-public-tls-certs\") pod \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751603 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-run-httpd\") pod \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751620 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-internal-tls-certs\") pod \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751643 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-log-httpd\") pod \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751658 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-generated\") pod \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751675 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-operator-scripts\") pod \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751711 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8t2k\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-kube-api-access-x8t2k\") pod \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\" (UID: \"7b2f75a4-a46a-4430-bf4d-d3c2c65d8510\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.751730 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-combined-ca-bundle\") pod \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\" (UID: \"7b306e38-c479-45ff-93ab-ca0e0e6a3aef\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.752041 4941 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 07 07:17:14 crc kubenswrapper[4941]: E0307 07:17:14.752085 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts podName:99ca3e53-9ebc-464c-ac37-51163b9bc104 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:15.252072144 +0000 UTC m=+1532.204437609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts") pod "root-account-create-update-jjz5s" (UID: "99ca3e53-9ebc-464c-ac37-51163b9bc104") : configmap "openstack-scripts" not found
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.760290 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" (UID: "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.762981 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7b306e38-c479-45ff-93ab-ca0e0e6a3aef" (UID: "7b306e38-c479-45ff-93ab-ca0e0e6a3aef"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.766054 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7b306e38-c479-45ff-93ab-ca0e0e6a3aef" (UID: "7b306e38-c479-45ff-93ab-ca0e0e6a3aef"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.766277 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" (UID: "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.766475 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b306e38-c479-45ff-93ab-ca0e0e6a3aef" (UID: "7b306e38-c479-45ff-93ab-ca0e0e6a3aef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.770150 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7b306e38-c479-45ff-93ab-ca0e0e6a3aef" (UID: "7b306e38-c479-45ff-93ab-ca0e0e6a3aef"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.775240 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-kube-api-access-x8t2k" (OuterVolumeSpecName: "kube-api-access-x8t2k") pod "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" (UID: "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510"). InnerVolumeSpecName "kube-api-access-x8t2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.775334 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" (UID: "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.782278 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kube-api-access-rxv7p" (OuterVolumeSpecName: "kube-api-access-rxv7p") pod "7b306e38-c479-45ff-93ab-ca0e0e6a3aef" (UID: "7b306e38-c479-45ff-93ab-ca0e0e6a3aef"). InnerVolumeSpecName "kube-api-access-rxv7p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.796296 4941 scope.go:117] "RemoveContainer" containerID="0fdfa5c28298504762261e59ae8634154d32cea145ca796274ab84812c8beeb1"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.809866 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "7b306e38-c479-45ff-93ab-ca0e0e6a3aef" (UID: "7b306e38-c479-45ff-93ab-ca0e0e6a3aef"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.809969 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-582b-account-create-update-j28r7"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.810121 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-427c-account-create-update-f84ls"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.810369 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b306e38-c479-45ff-93ab-ca0e0e6a3aef" (UID: "7b306e38-c479-45ff-93ab-ca0e0e6a3aef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.866191 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" (UID: "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.869961 4941 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.870006 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-default\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.870030 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873513 4941 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873532 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxv7p\" (UniqueName: \"kubernetes.io/projected/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-kube-api-access-rxv7p\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873541 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873549 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873559 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873567 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-config-data-generated\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873575 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873584 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8t2k\" (UniqueName: \"kubernetes.io/projected/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-kube-api-access-x8t2k\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.873593 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.875710 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" (UID: "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.880385 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-config-data" (OuterVolumeSpecName: "config-data") pod "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" (UID: "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.881551 4941 scope.go:117] "RemoveContainer" containerID="ea2fc562d127ab81fb4b03a280065325184993b1b4cb50e32728a5bc194bdcad"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.884684 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b99-account-create-update-mzq6w"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.885254 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" (UID: "7b2f75a4-a46a-4430-bf4d-d3c2c65d8510"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.909934 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "7b306e38-c479-45ff-93ab-ca0e0e6a3aef" (UID: "7b306e38-c479-45ff-93ab-ca0e0e6a3aef"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.920305 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1a10-account-create-update-crrkr"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.933203 4941 scope.go:117] "RemoveContainer" containerID="2dd2f8674fc7368ace30b5ccbaa4c590e6795b851ad19939a0b2b02851b97fd8"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.933594 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.938522 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.938629 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.941479 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.954674 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1a10-account-create-update-crrkr"]
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.971005 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.974867 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwxkq\" (UniqueName: \"kubernetes.io/projected/753c78f9-47e6-4098-91fa-9adac0997ba4-kube-api-access-bwxkq\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.977739 4941 scope.go:117] "RemoveContainer" containerID="5125f9279c7ec81414bff41771650af0c2b88b60d9d5f2957f576c5e9948b646"
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978762 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978801 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-operator-scripts\") pod \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\" (UID: \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978827 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-httpd-run\") pod \"e6d72c12-422e-48fd-b56b-8344260e3e01\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978852 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-internal-tls-certs\") pod \"e6d72c12-422e-48fd-b56b-8344260e3e01\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978894 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzbp6\" (UniqueName: \"kubernetes.io/projected/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-kube-api-access-bzbp6\") pod \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\" (UID: \"6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978918 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-certs\") pod \"74c4b049-d672-41e8-b3cb-09b800f04a19\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978954 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data-custom\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978980 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-scripts\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.978999 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-combined-ca-bundle\") pod \"74c4b049-d672-41e8-b3cb-09b800f04a19\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979039 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-public-tls-certs\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979071 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-httpd-run\") pod \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979091 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-config-data\") pod \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979114 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b055e9de-2e86-467d-9e93-8fd06977cc87-operator-scripts\") pod \"b055e9de-2e86-467d-9e93-8fd06977cc87\" (UID: \"b055e9de-2e86-467d-9e93-8fd06977cc87\") "
Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979143 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vft9m\" (UniqueName: \"kubernetes.io/projected/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-kube-api-access-vft9m\") pod \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") "
Mar 07 07:17:14
crc kubenswrapper[4941]: I0307 07:17:14.979171 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-logs\") pod \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979188 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/753c78f9-47e6-4098-91fa-9adac0997ba4-etc-machine-id\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979204 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-public-tls-certs\") pod \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979220 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-config-data\") pod \"e6d72c12-422e-48fd-b56b-8344260e3e01\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979237 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/753c78f9-47e6-4098-91fa-9adac0997ba4-logs\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979265 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncks6\" (UniqueName: \"kubernetes.io/projected/e6d72c12-422e-48fd-b56b-8344260e3e01-kube-api-access-ncks6\") pod 
\"e6d72c12-422e-48fd-b56b-8344260e3e01\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979284 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-scripts\") pod \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979309 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdkg5\" (UniqueName: \"kubernetes.io/projected/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-api-access-wdkg5\") pod \"74c4b049-d672-41e8-b3cb-09b800f04a19\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979331 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-combined-ca-bundle\") pod \"e6d72c12-422e-48fd-b56b-8344260e3e01\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979359 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-config\") pod \"74c4b049-d672-41e8-b3cb-09b800f04a19\" (UID: \"74c4b049-d672-41e8-b3cb-09b800f04a19\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979380 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-scripts\") pod \"e6d72c12-422e-48fd-b56b-8344260e3e01\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979418 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-logs\") pod \"e6d72c12-422e-48fd-b56b-8344260e3e01\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979450 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-combined-ca-bundle\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979465 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-internal-tls-certs\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979489 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2ztz\" (UniqueName: \"kubernetes.io/projected/b055e9de-2e86-467d-9e93-8fd06977cc87-kube-api-access-g2ztz\") pod \"b055e9de-2e86-467d-9e93-8fd06977cc87\" (UID: \"b055e9de-2e86-467d-9e93-8fd06977cc87\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979508 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e6d72c12-422e-48fd-b56b-8344260e3e01\" (UID: \"e6d72c12-422e-48fd-b56b-8344260e3e01\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979537 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data\") pod \"753c78f9-47e6-4098-91fa-9adac0997ba4\" (UID: \"753c78f9-47e6-4098-91fa-9adac0997ba4\") " Mar 07 07:17:14 crc kubenswrapper[4941]: 
I0307 07:17:14.979559 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-combined-ca-bundle\") pod \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\" (UID: \"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7\") " Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979966 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979985 4941 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b306e38-c479-45ff-93ab-ca0e0e6a3aef-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.979997 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.980009 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.980020 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.980679 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d" (UID: "6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.981539 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b055e9de-2e86-467d-9e93-8fd06977cc87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b055e9de-2e86-467d-9e93-8fd06977cc87" (UID: "b055e9de-2e86-467d-9e93-8fd06977cc87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.982026 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" (UID: "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.982688 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e6d72c12-422e-48fd-b56b-8344260e3e01" (UID: "e6d72c12-422e-48fd-b56b-8344260e3e01"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.988138 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d72c12-422e-48fd-b56b-8344260e3e01-kube-api-access-ncks6" (OuterVolumeSpecName: "kube-api-access-ncks6") pod "e6d72c12-422e-48fd-b56b-8344260e3e01" (UID: "e6d72c12-422e-48fd-b56b-8344260e3e01"). InnerVolumeSpecName "kube-api-access-ncks6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.988925 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/753c78f9-47e6-4098-91fa-9adac0997ba4-logs" (OuterVolumeSpecName: "logs") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:14 crc kubenswrapper[4941]: I0307 07:17:14.993541 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-kube-api-access-vft9m" (OuterVolumeSpecName: "kube-api-access-vft9m") pod "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" (UID: "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7"). InnerVolumeSpecName "kube-api-access-vft9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.000294 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/753c78f9-47e6-4098-91fa-9adac0997ba4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.002188 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-logs" (OuterVolumeSpecName: "logs") pod "e6d72c12-422e-48fd-b56b-8344260e3e01" (UID: "e6d72c12-422e-48fd-b56b-8344260e3e01"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.006107 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" (UID: "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.007718 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.008150 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-scripts" (OuterVolumeSpecName: "scripts") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.009509 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-scripts" (OuterVolumeSpecName: "scripts") pod "e6d72c12-422e-48fd-b56b-8344260e3e01" (UID: "e6d72c12-422e-48fd-b56b-8344260e3e01"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.011197 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-api-access-wdkg5" (OuterVolumeSpecName: "kube-api-access-wdkg5") pod "74c4b049-d672-41e8-b3cb-09b800f04a19" (UID: "74c4b049-d672-41e8-b3cb-09b800f04a19"). InnerVolumeSpecName "kube-api-access-wdkg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.015815 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753c78f9-47e6-4098-91fa-9adac0997ba4-kube-api-access-bwxkq" (OuterVolumeSpecName: "kube-api-access-bwxkq") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "kube-api-access-bwxkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.023912 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "e6d72c12-422e-48fd-b56b-8344260e3e01" (UID: "e6d72c12-422e-48fd-b56b-8344260e3e01"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.026256 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-kube-api-access-bzbp6" (OuterVolumeSpecName: "kube-api-access-bzbp6") pod "6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d" (UID: "6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d"). InnerVolumeSpecName "kube-api-access-bzbp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.033523 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-logs" (OuterVolumeSpecName: "logs") pod "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" (UID: "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.041802 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-685ff95674-ldzd4" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.064241 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1815-account-create-update-8g7ft"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.092802 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b055e9de-2e86-467d-9e93-8fd06977cc87-kube-api-access-g2ztz" (OuterVolumeSpecName: "kube-api-access-g2ztz") pod "b055e9de-2e86-467d-9e93-8fd06977cc87" (UID: "b055e9de-2e86-467d-9e93-8fd06977cc87"). InnerVolumeSpecName "kube-api-access-g2ztz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.092891 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-scripts" (OuterVolumeSpecName: "scripts") pod "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" (UID: "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.093149 4941 scope.go:117] "RemoveContainer" containerID="60388b14da3f3917c16c54cde22e16752e910c2ccf741bf0069462a58ed79601" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.093145 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:59810->10.217.0.210:8775: read: connection reset by peer" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.093353 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:59814->10.217.0.210:8775: read: connection reset by peer" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.094232 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1815-account-create-update-8g7ft"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.094265 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.129155 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwxkq\" (UniqueName: \"kubernetes.io/projected/753c78f9-47e6-4098-91fa-9adac0997ba4-kube-api-access-bwxkq\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.129208 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.129315 4941 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.129367 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.138615 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzbp6\" (UniqueName: \"kubernetes.io/projected/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d-kube-api-access-bzbp6\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.138935 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.138966 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.138984 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139028 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b055e9de-2e86-467d-9e93-8fd06977cc87-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139037 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vft9m\" (UniqueName: \"kubernetes.io/projected/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-kube-api-access-vft9m\") on 
node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139047 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139063 4941 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/753c78f9-47e6-4098-91fa-9adac0997ba4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139075 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/753c78f9-47e6-4098-91fa-9adac0997ba4-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139087 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncks6\" (UniqueName: \"kubernetes.io/projected/e6d72c12-422e-48fd-b56b-8344260e3e01-kube-api-access-ncks6\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139102 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139112 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdkg5\" (UniqueName: \"kubernetes.io/projected/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-api-access-wdkg5\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139122 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139133 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e6d72c12-422e-48fd-b56b-8344260e3e01-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139145 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2ztz\" (UniqueName: \"kubernetes.io/projected/b055e9de-2e86-467d-9e93-8fd06977cc87-kube-api-access-g2ztz\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139172 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.139502 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.147220 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54977b5b64-bxjq6" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.171:9311/healthcheck\": read tcp 10.217.0.2:33172->10.217.0.171:9311: read: connection reset by peer" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.147262 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54977b5b64-bxjq6" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.171:9311/healthcheck\": read tcp 10.217.0.2:33180->10.217.0.171:9311: read: connection reset by peer" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.165440 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6d72c12-422e-48fd-b56b-8344260e3e01" (UID: "e6d72c12-422e-48fd-b56b-8344260e3e01"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.193056 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" (UID: "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.215272 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240205 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-internal-tls-certs\") pod \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240275 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-scripts\") pod \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240327 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8m4g\" (UniqueName: \"kubernetes.io/projected/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-kube-api-access-t8m4g\") pod \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240353 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-config-data\") pod \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240389 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-combined-ca-bundle\") pod \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240494 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-public-tls-certs\") pod \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240542 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-logs\") pod \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\" (UID: \"ad2b6a75-839f-4fec-9f12-fb520b44c7ce\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240777 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240789 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.240800 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node 
\"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.241221 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-logs" (OuterVolumeSpecName: "logs") pod "ad2b6a75-839f-4fec-9f12-fb520b44c7ce" (UID: "ad2b6a75-839f-4fec-9f12-fb520b44c7ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.246817 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.259965 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-scripts" (OuterVolumeSpecName: "scripts") pod "ad2b6a75-839f-4fec-9f12-fb520b44c7ce" (UID: "ad2b6a75-839f-4fec-9f12-fb520b44c7ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.306949 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-kube-api-access-t8m4g" (OuterVolumeSpecName: "kube-api-access-t8m4g") pod "ad2b6a75-839f-4fec-9f12-fb520b44c7ce" (UID: "ad2b6a75-839f-4fec-9f12-fb520b44c7ce"). InnerVolumeSpecName "kube-api-access-t8m4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.307088 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "74c4b049-d672-41e8-b3cb-09b800f04a19" (UID: "74c4b049-d672-41e8-b3cb-09b800f04a19"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.312072 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.329270 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74c4b049-d672-41e8-b3cb-09b800f04a19" (UID: "74c4b049-d672-41e8-b3cb-09b800f04a19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.344922 4941 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.344991 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts podName:99ca3e53-9ebc-464c-ac37-51163b9bc104 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:16.344972624 +0000 UTC m=+1533.297338089 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts") pod "root-account-create-update-jjz5s" (UID: "99ca3e53-9ebc-464c-ac37-51163b9bc104") : configmap "openstack-scripts" not found Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.364502 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8m4g\" (UniqueName: \"kubernetes.io/projected/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-kube-api-access-t8m4g\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.364591 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.364608 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.364621 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.364637 4941 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.364650 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.364662 4941 reconciler_common.go:293] 
"Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.367233 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e6d72c12-422e-48fd-b56b-8344260e3e01" (UID: "e6d72c12-422e-48fd-b56b-8344260e3e01"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.375036 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.378199 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.395632 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data" (OuterVolumeSpecName: "config-data") pod "753c78f9-47e6-4098-91fa-9adac0997ba4" (UID: "753c78f9-47e6-4098-91fa-9adac0997ba4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.403055 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" (UID: "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.412951 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-config-data" (OuterVolumeSpecName: "config-data") pod "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" (UID: "927b9eb0-124f-4a2c-86ae-2ea4cbe609e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.413477 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-config-data" (OuterVolumeSpecName: "config-data") pod "e6d72c12-422e-48fd-b56b-8344260e3e01" (UID: "e6d72c12-422e-48fd-b56b-8344260e3e01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.415820 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "74c4b049-d672-41e8-b3cb-09b800f04a19" (UID: "74c4b049-d672-41e8-b3cb-09b800f04a19"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.466549 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.466572 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.466581 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.466590 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.466600 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.466608 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753c78f9-47e6-4098-91fa-9adac0997ba4-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.466617 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6d72c12-422e-48fd-b56b-8344260e3e01-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.466626 4941 reconciler_common.go:293] "Volume detached 
for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c4b049-d672-41e8-b3cb-09b800f04a19-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.492660 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-config-data" (OuterVolumeSpecName: "config-data") pod "ad2b6a75-839f-4fec-9f12-fb520b44c7ce" (UID: "ad2b6a75-839f-4fec-9f12-fb520b44c7ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.492678 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad2b6a75-839f-4fec-9f12-fb520b44c7ce" (UID: "ad2b6a75-839f-4fec-9f12-fb520b44c7ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.509566 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ad2b6a75-839f-4fec-9f12-fb520b44c7ce" (UID: "ad2b6a75-839f-4fec-9f12-fb520b44c7ce"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.558339 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ad2b6a75-839f-4fec-9f12-fb520b44c7ce" (UID: "ad2b6a75-839f-4fec-9f12-fb520b44c7ce"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.568310 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.568344 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.568353 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.568361 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2b6a75-839f-4fec-9f12-fb520b44c7ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.609593 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.612572 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.614842 4941 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.614886 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="ovn-northd" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.685658 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5598777fd7-9fgcl"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.689163 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.690546 4941 scope.go:117] "RemoveContainer" containerID="ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.694756 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5598777fd7-9fgcl"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.698432 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.710647 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.712610 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.714934 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.733992 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.754587 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.769868 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-config-data\") pod \"c892cbf7-126c-4638-854d-18cef63c7747\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.769907 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-ceilometer-tls-certs\") pod \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.769928 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-combined-ca-bundle\") pod \"757b037d-b7b8-4690-93b9-ec85c5bf82db\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.769945 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-config-data\") pod \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.769968 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-internal-tls-certs\") pod \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.769988 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-logs\") pod \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770023 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-log-httpd\") pod \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770067 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-combined-ca-bundle\") pod \"5b22d449-d1ec-4bf4-a876-b86a87508580\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770094 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-public-tls-certs\") pod \"757b037d-b7b8-4690-93b9-ec85c5bf82db\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770115 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzkch\" (UniqueName: \"kubernetes.io/projected/c892cbf7-126c-4638-854d-18cef63c7747-kube-api-access-wzkch\") pod \"c892cbf7-126c-4638-854d-18cef63c7747\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 
07:17:15.770140 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892cbf7-126c-4638-854d-18cef63c7747-logs\") pod \"c892cbf7-126c-4638-854d-18cef63c7747\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770169 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6wt\" (UniqueName: \"kubernetes.io/projected/5b22d449-d1ec-4bf4-a876-b86a87508580-kube-api-access-qt6wt\") pod \"5b22d449-d1ec-4bf4-a876-b86a87508580\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770192 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-public-tls-certs\") pod \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770208 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-nova-metadata-tls-certs\") pod \"c892cbf7-126c-4638-854d-18cef63c7747\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770228 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-config-data\") pod \"5b22d449-d1ec-4bf4-a876-b86a87508580\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770253 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-combined-ca-bundle\") pod 
\"fc5e0ad9-b4e5-4307-a381-3a92092a3240\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770270 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-combined-ca-bundle\") pod \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770288 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data\") pod \"757b037d-b7b8-4690-93b9-ec85c5bf82db\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770310 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data-custom\") pod \"757b037d-b7b8-4690-93b9-ec85c5bf82db\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770338 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-sg-core-conf-yaml\") pod \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770361 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-kolla-config\") pod \"5b22d449-d1ec-4bf4-a876-b86a87508580\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770377 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-s6k8n\" (UniqueName: \"kubernetes.io/projected/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-kube-api-access-s6k8n\") pod \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\" (UID: \"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770394 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntm2d\" (UniqueName: \"kubernetes.io/projected/757b037d-b7b8-4690-93b9-ec85c5bf82db-kube-api-access-ntm2d\") pod \"757b037d-b7b8-4690-93b9-ec85c5bf82db\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770432 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnh29\" (UniqueName: \"kubernetes.io/projected/fc5e0ad9-b4e5-4307-a381-3a92092a3240-kube-api-access-mnh29\") pod \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770449 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-run-httpd\") pod \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770473 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-memcached-tls-certs\") pod \"5b22d449-d1ec-4bf4-a876-b86a87508580\" (UID: \"5b22d449-d1ec-4bf4-a876-b86a87508580\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770499 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-scripts\") pod \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\" (UID: 
\"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770523 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-combined-ca-bundle\") pod \"c892cbf7-126c-4638-854d-18cef63c7747\" (UID: \"c892cbf7-126c-4638-854d-18cef63c7747\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770574 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-config-data\") pod \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\" (UID: \"fc5e0ad9-b4e5-4307-a381-3a92092a3240\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770603 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757b037d-b7b8-4690-93b9-ec85c5bf82db-logs\") pod \"757b037d-b7b8-4690-93b9-ec85c5bf82db\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.770623 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-internal-tls-certs\") pod \"757b037d-b7b8-4690-93b9-ec85c5bf82db\" (UID: \"757b037d-b7b8-4690-93b9-ec85c5bf82db\") " Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.779525 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5e0ad9-b4e5-4307-a381-3a92092a3240-kube-api-access-mnh29" (OuterVolumeSpecName: "kube-api-access-mnh29") pod "fc5e0ad9-b4e5-4307-a381-3a92092a3240" (UID: "fc5e0ad9-b4e5-4307-a381-3a92092a3240"). InnerVolumeSpecName "kube-api-access-mnh29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.779539 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.779965 4941 scope.go:117] "RemoveContainer" containerID="ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.782665 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c892cbf7-126c-4638-854d-18cef63c7747-kube-api-access-wzkch" (OuterVolumeSpecName: "kube-api-access-wzkch") pod "c892cbf7-126c-4638-854d-18cef63c7747" (UID: "c892cbf7-126c-4638-854d-18cef63c7747"). InnerVolumeSpecName "kube-api-access-wzkch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.787326 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd\": container with ID starting with ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd not found: ID does not exist" containerID="ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.787372 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd"} err="failed to get container status \"ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd\": rpc error: code = NotFound desc = could not find container \"ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd\": container with ID starting with ea59faa7fbaa612c5a14ba8827328d12434f04b728a94cf165fed963937bc9cd not found: ID does not exist" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.787396 4941 scope.go:117] "RemoveContainer" containerID="1a1116177ff225faade78191d5e77b34a374f65348259ab6e65f258ca8274368" Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.787626 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.791465 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.791533 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.792385 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c892cbf7-126c-4638-854d-18cef63c7747-logs" (OuterVolumeSpecName: "logs") pod "c892cbf7-126c-4638-854d-18cef63c7747" (UID: "c892cbf7-126c-4638-854d-18cef63c7747"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.792503 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757b037d-b7b8-4690-93b9-ec85c5bf82db-logs" (OuterVolumeSpecName: "logs") pod "757b037d-b7b8-4690-93b9-ec85c5bf82db" (UID: "757b037d-b7b8-4690-93b9-ec85c5bf82db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.793936 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5b22d449-d1ec-4bf4-a876-b86a87508580" (UID: "5b22d449-d1ec-4bf4-a876-b86a87508580"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.794040 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-config-data" (OuterVolumeSpecName: "config-data") pod "5b22d449-d1ec-4bf4-a876-b86a87508580" (UID: "5b22d449-d1ec-4bf4-a876-b86a87508580"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.794435 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fc5e0ad9-b4e5-4307-a381-3a92092a3240" (UID: "fc5e0ad9-b4e5-4307-a381-3a92092a3240"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.794818 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-logs" (OuterVolumeSpecName: "logs") pod "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" (UID: "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.797319 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fc5e0ad9-b4e5-4307-a381-3a92092a3240" (UID: "fc5e0ad9-b4e5-4307-a381-3a92092a3240"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.801689 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-scripts" (OuterVolumeSpecName: "scripts") pod "fc5e0ad9-b4e5-4307-a381-3a92092a3240" (UID: "fc5e0ad9-b4e5-4307-a381-3a92092a3240"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.812584 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b22d449-d1ec-4bf4-a876-b86a87508580-kube-api-access-qt6wt" (OuterVolumeSpecName: "kube-api-access-qt6wt") pod "5b22d449-d1ec-4bf4-a876-b86a87508580" (UID: "5b22d449-d1ec-4bf4-a876-b86a87508580"). InnerVolumeSpecName "kube-api-access-qt6wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.812616 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"927b9eb0-124f-4a2c-86ae-2ea4cbe609e7","Type":"ContainerDied","Data":"7090c3e128fde91da862c0961652bb0c7d813f6b4403bdd78912771666681ca1"} Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.812660 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.812787 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.818292 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.823268 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757b037d-b7b8-4690-93b9-ec85c5bf82db-kube-api-access-ntm2d" (OuterVolumeSpecName: "kube-api-access-ntm2d") pod "757b037d-b7b8-4690-93b9-ec85c5bf82db" (UID: "757b037d-b7b8-4690-93b9-ec85c5bf82db"). InnerVolumeSpecName "kube-api-access-ntm2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.849861 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "757b037d-b7b8-4690-93b9-ec85c5bf82db" (UID: "757b037d-b7b8-4690-93b9-ec85c5bf82db"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.849995 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:15 crc kubenswrapper[4941]: E0307 07:17:15.850056 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.863662 4941 scope.go:117] "RemoveContainer" containerID="d6c2f62c9103f19083c550098257ae768e69623eab5370a23a6b39d03261c98b" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887619 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887648 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887662 4941 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b22d449-d1ec-4bf4-a876-b86a87508580-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887676 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntm2d\" (UniqueName: 
\"kubernetes.io/projected/757b037d-b7b8-4690-93b9-ec85c5bf82db-kube-api-access-ntm2d\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887691 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnh29\" (UniqueName: \"kubernetes.io/projected/fc5e0ad9-b4e5-4307-a381-3a92092a3240-kube-api-access-mnh29\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887704 4941 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887717 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887727 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/757b037d-b7b8-4690-93b9-ec85c5bf82db-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887737 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887746 4941 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc5e0ad9-b4e5-4307-a381-3a92092a3240-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887754 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzkch\" (UniqueName: \"kubernetes.io/projected/c892cbf7-126c-4638-854d-18cef63c7747-kube-api-access-wzkch\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 
07:17:15.887761 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c892cbf7-126c-4638-854d-18cef63c7747-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.887769 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt6wt\" (UniqueName: \"kubernetes.io/projected/5b22d449-d1ec-4bf4-a876-b86a87508580-kube-api-access-qt6wt\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.895924 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-kube-api-access-s6k8n" (OuterVolumeSpecName: "kube-api-access-s6k8n") pod "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" (UID: "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456"). InnerVolumeSpecName "kube-api-access-s6k8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.910591 4941 generic.go:334] "Generic (PLEG): container finished" podID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerID="24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a" exitCode=0 Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.910691 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerDied","Data":"24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a"} Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.910719 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc5e0ad9-b4e5-4307-a381-3a92092a3240","Type":"ContainerDied","Data":"48df85e9c8bbe3dacf4a6e9fa2e3e50848014284023c8e98a3b7426f16cf4280"} Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.910799 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.945570 4941 scope.go:117] "RemoveContainer" containerID="4b79935f1592e509f7cf336d7d435662bd63d7104ffb2d6d1825034c8387696c" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.960575 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "757b037d-b7b8-4690-93b9-ec85c5bf82db" (UID: "757b037d-b7b8-4690-93b9-ec85c5bf82db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.982230 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b22d449-d1ec-4bf4-a876-b86a87508580" (UID: "5b22d449-d1ec-4bf4-a876-b86a87508580"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.993427 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.993462 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:15 crc kubenswrapper[4941]: I0307 07:17:15.993472 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6k8n\" (UniqueName: \"kubernetes.io/projected/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-kube-api-access-s6k8n\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.007326 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-config-data" (OuterVolumeSpecName: "config-data") pod "c892cbf7-126c-4638-854d-18cef63c7747" (UID: "c892cbf7-126c-4638-854d-18cef63c7747"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.017070 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0491b032-0a65-4d6e-904e-b464a0acfcda" path="/var/lib/kubelet/pods/0491b032-0a65-4d6e-904e-b464a0acfcda/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.020761 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="056debcc-d271-4ea1-a70c-fc67794f060e" path="/var/lib/kubelet/pods/056debcc-d271-4ea1-a70c-fc67794f060e/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.021220 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38422d86-9fa3-4547-a810-106f783ac38a" path="/var/lib/kubelet/pods/38422d86-9fa3-4547-a810-106f783ac38a/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.021645 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-config-data" (OuterVolumeSpecName: "config-data") pod "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" (UID: "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.021706 4941 generic.go:334] "Generic (PLEG): container finished" podID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerID="8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85" exitCode=0 Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.021820 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5d8489-0104-4980-9f26-1330336ef7f0" path="/var/lib/kubelet/pods/4f5d8489-0104-4980-9f26-1330336ef7f0/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.021844 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.024807 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1c3983-6c5e-48af-95cf-5f9536835f8d" path="/var/lib/kubelet/pods/6a1c3983-6c5e-48af-95cf-5f9536835f8d/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.025126 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79039479-0c9b-4931-8d9a-84271be3fee5" path="/var/lib/kubelet/pods/79039479-0c9b-4931-8d9a-84271be3fee5/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.025485 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" path="/var/lib/kubelet/pods/7b2f75a4-a46a-4430-bf4d-d3c2c65d8510/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.026088 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" path="/var/lib/kubelet/pods/7b306e38-c479-45ff-93ab-ca0e0e6a3aef/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.027068 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8819def4-42df-4a8f-b5d0-21db1e1ca87a" path="/var/lib/kubelet/pods/8819def4-42df-4a8f-b5d0-21db1e1ca87a/volumes" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.041562 4941 scope.go:117] "RemoveContainer" containerID="dc55c3da137f47538c0bd3c217f221d45ba4d3f6188abc6966831c84aa69682d" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.061531 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.084519 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456","Type":"ContainerDied","Data":"8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.084580 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.084599 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.084614 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0b4aa9b-9dc9-4e99-87e8-6320c5a81456","Type":"ContainerDied","Data":"7d06f9c8e232fe9e1561f48bec2c8f98d10b80ac8b6c45382a1fe11a80ade1c0"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.084627 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"74c4b049-d672-41e8-b3cb-09b800f04a19","Type":"ContainerDied","Data":"30521c90e4212432949f623b8c8a6e0634a9ca40ea2fa873605d323c704b886f"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.094629 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.095896 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.103512 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" (UID: "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.111581 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-685ff95674-ldzd4" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.112335 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-685ff95674-ldzd4" event={"ID":"ad2b6a75-839f-4fec-9f12-fb520b44c7ce","Type":"ContainerDied","Data":"06f78592f5289b476b99770c794d25dc413e7266988baee70a28e8e01616b6b7"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.114108 4941 generic.go:334] "Generic (PLEG): container finished" podID="5b22d449-d1ec-4bf4-a876-b86a87508580" containerID="db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6" exitCode=0 Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.114164 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5b22d449-d1ec-4bf4-a876-b86a87508580","Type":"ContainerDied","Data":"db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.114179 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5b22d449-d1ec-4bf4-a876-b86a87508580","Type":"ContainerDied","Data":"cf6eeac298dd15c7159f49dece5dd7887b60604bebe300e1521ba43743b01426"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.114230 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.115795 4941 generic.go:334] "Generic (PLEG): container finished" podID="c892cbf7-126c-4638-854d-18cef63c7747" containerID="dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a" exitCode=0 Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.115840 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c892cbf7-126c-4638-854d-18cef63c7747","Type":"ContainerDied","Data":"dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.115858 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c892cbf7-126c-4638-854d-18cef63c7747","Type":"ContainerDied","Data":"2399780da12c932963c0a5020fc82674c1fb29a3cc71a2e1aafaa5fa7f5f1133"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.115908 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.122480 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" (UID: "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.124293 4941 scope.go:117] "RemoveContainer" containerID="cfcc8787654437566381b6642741214a6b8bd5d86a6e869ab2659ad125818269" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.130825 4941 generic.go:334] "Generic (PLEG): container finished" podID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerID="9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8" exitCode=0 Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.130934 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.132057 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54977b5b64-bxjq6" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.132208 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54977b5b64-bxjq6" event={"ID":"757b037d-b7b8-4690-93b9-ec85c5bf82db","Type":"ContainerDied","Data":"9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.132233 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54977b5b64-bxjq6" event={"ID":"757b037d-b7b8-4690-93b9-ec85c5bf82db","Type":"ContainerDied","Data":"652e37556e4bca6529743cd5de087a63bc1a8c0e445000eab404d433d4401493"} Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.132366 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.132600 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-427c-account-create-update-f84ls" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.132644 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b99-account-create-update-mzq6w" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.132879 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-582b-account-create-update-j28r7" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.152702 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fc5e0ad9-b4e5-4307-a381-3a92092a3240" (UID: "fc5e0ad9-b4e5-4307-a381-3a92092a3240"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.161974 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "757b037d-b7b8-4690-93b9-ec85c5bf82db" (UID: "757b037d-b7b8-4690-93b9-ec85c5bf82db"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.162058 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c892cbf7-126c-4638-854d-18cef63c7747" (UID: "c892cbf7-126c-4638-854d-18cef63c7747"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.164487 4941 scope.go:117] "RemoveContainer" containerID="2c0681e10d442ad2f1f0fc4b809613e3cdacd8d007ff20ae0809c479b09094cb" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.176765 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-685ff95674-ldzd4"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.191514 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fc5e0ad9-b4e5-4307-a381-3a92092a3240" (UID: "fc5e0ad9-b4e5-4307-a381-3a92092a3240"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.196241 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-685ff95674-ldzd4"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.197156 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.197182 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.197192 4941 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.197200 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.197209 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.197217 4941 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.207434 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.216832 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.224518 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.259199 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.290475 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mzq6w"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.291898 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data" (OuterVolumeSpecName: "config-data") pod "757b037d-b7b8-4690-93b9-ec85c5bf82db" (UID: "757b037d-b7b8-4690-93b9-ec85c5bf82db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.299482 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.300641 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c892cbf7-126c-4638-854d-18cef63c7747" (UID: "c892cbf7-126c-4638-854d-18cef63c7747"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.301465 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "757b037d-b7b8-4690-93b9-ec85c5bf82db" (UID: "757b037d-b7b8-4690-93b9-ec85c5bf82db"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.302973 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3b99-account-create-update-mzq6w"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.309444 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" (UID: "a0b4aa9b-9dc9-4e99-87e8-6320c5a81456"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.312062 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc5e0ad9-b4e5-4307-a381-3a92092a3240" (UID: "fc5e0ad9-b4e5-4307-a381-3a92092a3240"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.325708 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-config-data" (OuterVolumeSpecName: "config-data") pod "fc5e0ad9-b4e5-4307-a381-3a92092a3240" (UID: "fc5e0ad9-b4e5-4307-a381-3a92092a3240"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.337662 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "5b22d449-d1ec-4bf4-a876-b86a87508580" (UID: "5b22d449-d1ec-4bf4-a876-b86a87508580"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.365601 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-582b-account-create-update-j28r7"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.373777 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-582b-account-create-update-j28r7"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.400769 4941 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b22d449-d1ec-4bf4-a876-b86a87508580-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.400814 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.400814 4941 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.400894 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts podName:99ca3e53-9ebc-464c-ac37-51163b9bc104 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:18.400875877 +0000 UTC m=+1535.353241332 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts") pod "root-account-create-update-jjz5s" (UID: "99ca3e53-9ebc-464c-ac37-51163b9bc104") : configmap "openstack-scripts" not found Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.400823 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.400926 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/757b037d-b7b8-4690-93b9-ec85c5bf82db-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.400937 4941 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c892cbf7-126c-4638-854d-18cef63c7747-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.400946 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e0ad9-b4e5-4307-a381-3a92092a3240-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.566373 4941 scope.go:117] "RemoveContainer" containerID="885046a8a3875aba6340cf20fd6d156d9145acc04deb00a3dc3da2bb74d33aa4" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.587022 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-427c-account-create-update-f84ls"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.604273 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-427c-account-create-update-f84ls"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.614947 4941 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.615875 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.615919 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data podName:3963d293-d9e9-44b6-b0a5-b1532b4a0a31 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:24.615904615 +0000 UTC m=+1541.568270080 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data") pod "rabbitmq-server-0" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31") : configmap "rabbitmq-config-data" not found Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.621602 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.627018 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.631031 4941 scope.go:117] "RemoveContainer" containerID="b048a7900d858225d4830b842a8b3ed5f22a79a6146d7e6677222d62c98ef913" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.636078 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.643020 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54977b5b64-bxjq6"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.647435 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-54977b5b64-bxjq6"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.655135 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.665371 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.674506 4941 scope.go:117] "RemoveContainer" containerID="7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.678533 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.689459 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.699653 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.704908 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.706006 4941 scope.go:117] "RemoveContainer" containerID="340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.715000 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.717035 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62j4x\" (UniqueName: \"kubernetes.io/projected/99ca3e53-9ebc-464c-ac37-51163b9bc104-kube-api-access-62j4x\") pod \"99ca3e53-9ebc-464c-ac37-51163b9bc104\" (UID: \"99ca3e53-9ebc-464c-ac37-51163b9bc104\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.717186 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts\") pod 
\"99ca3e53-9ebc-464c-ac37-51163b9bc104\" (UID: \"99ca3e53-9ebc-464c-ac37-51163b9bc104\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.717988 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99ca3e53-9ebc-464c-ac37-51163b9bc104" (UID: "99ca3e53-9ebc-464c-ac37-51163b9bc104"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.726567 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ca3e53-9ebc-464c-ac37-51163b9bc104-kube-api-access-62j4x" (OuterVolumeSpecName: "kube-api-access-62j4x") pod "99ca3e53-9ebc-464c-ac37-51163b9bc104" (UID: "99ca3e53-9ebc-464c-ac37-51163b9bc104"). InnerVolumeSpecName "kube-api-access-62j4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.740307 4941 scope.go:117] "RemoveContainer" containerID="24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.758713 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d1ad12db-0b25-4e03-8772-de047be41b0d/ovn-northd/0.log" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.758773 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.788102 4941 scope.go:117] "RemoveContainer" containerID="6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.814857 4941 scope.go:117] "RemoveContainer" containerID="7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d" Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.815222 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d\": container with ID starting with 7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d not found: ID does not exist" containerID="7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.815261 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d"} err="failed to get container status \"7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d\": rpc error: code = NotFound desc = could not find container \"7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d\": container with ID starting with 7b499e320c5a54cf02f4961762e570d445ed619170c6bc4f7f796075d747290d not found: ID does not exist" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.815294 4941 scope.go:117] "RemoveContainer" containerID="340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f" Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.815726 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f\": container with ID starting with 
340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f not found: ID does not exist" containerID="340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.815752 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f"} err="failed to get container status \"340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f\": rpc error: code = NotFound desc = could not find container \"340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f\": container with ID starting with 340bf7be6f7110cc16013f4c5fec8c41a77df854bfe6612ef7ea3b858a27fa5f not found: ID does not exist" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.815767 4941 scope.go:117] "RemoveContainer" containerID="24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a" Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.815993 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a\": container with ID starting with 24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a not found: ID does not exist" containerID="24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.816015 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a"} err="failed to get container status \"24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a\": rpc error: code = NotFound desc = could not find container \"24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a\": container with ID starting with 24d5c64686bdb3b51a90c78d352242e68fa608bef152f77db898dddb78032f3a not found: ID does not 
exist" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.816029 4941 scope.go:117] "RemoveContainer" containerID="6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494" Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.818474 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494\": container with ID starting with 6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494 not found: ID does not exist" containerID="6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818513 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494"} err="failed to get container status \"6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494\": rpc error: code = NotFound desc = could not find container \"6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494\": container with ID starting with 6ea28da4f040b551e0f722f361b7d4996299e44365e98a82b4392308db4b8494 not found: ID does not exist" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818526 4941 scope.go:117] "RemoveContainer" containerID="8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818640 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-northd-tls-certs\") pod \"d1ad12db-0b25-4e03-8772-de047be41b0d\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818693 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pkpg\" (UniqueName: 
\"kubernetes.io/projected/d1ad12db-0b25-4e03-8772-de047be41b0d-kube-api-access-4pkpg\") pod \"d1ad12db-0b25-4e03-8772-de047be41b0d\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818733 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-rundir\") pod \"d1ad12db-0b25-4e03-8772-de047be41b0d\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818788 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-config\") pod \"d1ad12db-0b25-4e03-8772-de047be41b0d\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818856 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-metrics-certs-tls-certs\") pod \"d1ad12db-0b25-4e03-8772-de047be41b0d\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818875 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-scripts\") pod \"d1ad12db-0b25-4e03-8772-de047be41b0d\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.818912 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-combined-ca-bundle\") pod \"d1ad12db-0b25-4e03-8772-de047be41b0d\" (UID: \"d1ad12db-0b25-4e03-8772-de047be41b0d\") " Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.819312 
4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62j4x\" (UniqueName: \"kubernetes.io/projected/99ca3e53-9ebc-464c-ac37-51163b9bc104-kube-api-access-62j4x\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.819327 4941 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99ca3e53-9ebc-464c-ac37-51163b9bc104-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.819553 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "d1ad12db-0b25-4e03-8772-de047be41b0d" (UID: "d1ad12db-0b25-4e03-8772-de047be41b0d"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.820840 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-scripts" (OuterVolumeSpecName: "scripts") pod "d1ad12db-0b25-4e03-8772-de047be41b0d" (UID: "d1ad12db-0b25-4e03-8772-de047be41b0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.821178 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-config" (OuterVolumeSpecName: "config") pod "d1ad12db-0b25-4e03-8772-de047be41b0d" (UID: "d1ad12db-0b25-4e03-8772-de047be41b0d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.823991 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ad12db-0b25-4e03-8772-de047be41b0d-kube-api-access-4pkpg" (OuterVolumeSpecName: "kube-api-access-4pkpg") pod "d1ad12db-0b25-4e03-8772-de047be41b0d" (UID: "d1ad12db-0b25-4e03-8772-de047be41b0d"). InnerVolumeSpecName "kube-api-access-4pkpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.847140 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1ad12db-0b25-4e03-8772-de047be41b0d" (UID: "d1ad12db-0b25-4e03-8772-de047be41b0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.854944 4941 scope.go:117] "RemoveContainer" containerID="e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.883663 4941 scope.go:117] "RemoveContainer" containerID="8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85" Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.887382 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85\": container with ID starting with 8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85 not found: ID does not exist" containerID="8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.887427 4941 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85"} err="failed to get container status \"8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85\": rpc error: code = NotFound desc = could not find container \"8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85\": container with ID starting with 8fd831041666d61389790739f7bec118187287e501eddb0ae0421e7a9df1ec85 not found: ID does not exist" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.887452 4941 scope.go:117] "RemoveContainer" containerID="e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f" Mar 07 07:17:16 crc kubenswrapper[4941]: E0307 07:17:16.888911 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f\": container with ID starting with e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f not found: ID does not exist" containerID="e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.888945 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f"} err="failed to get container status \"e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f\": rpc error: code = NotFound desc = could not find container \"e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f\": container with ID starting with e0bc59efb00016de1cf1533c35c568041ed4eee0c8728d1661b2d85f7921e10f not found: ID does not exist" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.888960 4941 scope.go:117] "RemoveContainer" containerID="d0ced7486dfed1220f94f3b911f9652e0c4769872e3285a1108e1beb5bef597b" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.903490 4941 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d1ad12db-0b25-4e03-8772-de047be41b0d" (UID: "d1ad12db-0b25-4e03-8772-de047be41b0d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.909631 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.912788 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "d1ad12db-0b25-4e03-8772-de047be41b0d" (UID: "d1ad12db-0b25-4e03-8772-de047be41b0d"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.917367 4941 scope.go:117] "RemoveContainer" containerID="fb6516769d261d733fc9be130e2e6aea292a7c5ff2e94299bbabd569bbe859a7" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.920792 4941 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.920812 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pkpg\" (UniqueName: \"kubernetes.io/projected/d1ad12db-0b25-4e03-8772-de047be41b0d-kube-api-access-4pkpg\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.920822 4941 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1ad12db-0b25-4e03-8772-de047be41b0d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc 
kubenswrapper[4941]: I0307 07:17:16.920831 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.920841 4941 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.920851 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1ad12db-0b25-4e03-8772-de047be41b0d-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.920858 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ad12db-0b25-4e03-8772-de047be41b0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.954547 4941 scope.go:117] "RemoveContainer" containerID="f8fe04edd08619bd4118ac72769de11cb1dc8824ae2dfa0799d60d9d7cab0731" Mar 07 07:17:16 crc kubenswrapper[4941]: I0307 07:17:16.975012 4941 scope.go:117] "RemoveContainer" containerID="db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.000219 4941 scope.go:117] "RemoveContainer" containerID="db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.000804 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6\": container with ID starting with db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6 not found: ID does not exist" 
containerID="db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.000835 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6"} err="failed to get container status \"db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6\": rpc error: code = NotFound desc = could not find container \"db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6\": container with ID starting with db1a9f99b81bb6ecb54ee4a7546b073acc7305f1c4b4f070a77885466d64e5f6 not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.000855 4941 scope.go:117] "RemoveContainer" containerID="dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021119 4941 scope.go:117] "RemoveContainer" containerID="bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021665 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kolla-config\") pod \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021750 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-generated\") pod \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021789 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-combined-ca-bundle\") pod \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021815 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-operator-scripts\") pod \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021846 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-galera-tls-certs\") pod \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021871 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjnw5\" (UniqueName: \"kubernetes.io/projected/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kube-api-access-jjnw5\") pod \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021897 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.021923 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-default\") pod \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\" (UID: \"b1fb4667-396e-44bb-a2ed-e576a9b69be2\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 
07:17:17.022347 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b1fb4667-396e-44bb-a2ed-e576a9b69be2" (UID: "b1fb4667-396e-44bb-a2ed-e576a9b69be2"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.022833 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "b1fb4667-396e-44bb-a2ed-e576a9b69be2" (UID: "b1fb4667-396e-44bb-a2ed-e576a9b69be2"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.022843 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "b1fb4667-396e-44bb-a2ed-e576a9b69be2" (UID: "b1fb4667-396e-44bb-a2ed-e576a9b69be2"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.022947 4941 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.022996 4941 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data podName:aeb1dd04-5b8c-49b4-bf65-be38fb8ae670 nodeName:}" failed. No retries permitted until 2026-03-07 07:17:25.022978518 +0000 UTC m=+1541.975343983 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data") pod "rabbitmq-cell1-server-0" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670") : configmap "rabbitmq-cell1-config-data" not found Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.023022 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1fb4667-396e-44bb-a2ed-e576a9b69be2" (UID: "b1fb4667-396e-44bb-a2ed-e576a9b69be2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.032032 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kube-api-access-jjnw5" (OuterVolumeSpecName: "kube-api-access-jjnw5") pod "b1fb4667-396e-44bb-a2ed-e576a9b69be2" (UID: "b1fb4667-396e-44bb-a2ed-e576a9b69be2"). InnerVolumeSpecName "kube-api-access-jjnw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.038260 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "b1fb4667-396e-44bb-a2ed-e576a9b69be2" (UID: "b1fb4667-396e-44bb-a2ed-e576a9b69be2"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.051753 4941 scope.go:117] "RemoveContainer" containerID="dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.052302 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a\": container with ID starting with dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a not found: ID does not exist" containerID="dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.052346 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a"} err="failed to get container status \"dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a\": rpc error: code = NotFound desc = could not find container \"dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a\": container with ID starting with dcc3d4c395eb430f79b4594643474a932cbc5f6574ab24c67ad701a47825619a not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.052374 4941 scope.go:117] "RemoveContainer" containerID="bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.052678 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d\": container with ID starting with bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d not found: ID does not exist" containerID="bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 
07:17:17.052718 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d"} err="failed to get container status \"bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d\": rpc error: code = NotFound desc = could not find container \"bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d\": container with ID starting with bbeb9946a442f98e015fb840554d04da471185075dcd4f2981db1c33d0175b7d not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.052744 4941 scope.go:117] "RemoveContainer" containerID="9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.054033 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1fb4667-396e-44bb-a2ed-e576a9b69be2" (UID: "b1fb4667-396e-44bb-a2ed-e576a9b69be2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.071089 4941 scope.go:117] "RemoveContainer" containerID="68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.072144 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "b1fb4667-396e-44bb-a2ed-e576a9b69be2" (UID: "b1fb4667-396e-44bb-a2ed-e576a9b69be2"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.096722 4941 scope.go:117] "RemoveContainer" containerID="9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.097906 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8\": container with ID starting with 9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8 not found: ID does not exist" containerID="9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.097937 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8"} err="failed to get container status \"9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8\": rpc error: code = NotFound desc = could not find container \"9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8\": container with ID starting with 9602f92d7dcc10b2686d4e7085e3763f533c9e0a49800d5c37d92fe2fd53acf8 not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.097958 4941 scope.go:117] "RemoveContainer" containerID="68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.098269 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3\": container with ID starting with 68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3 not found: ID does not exist" containerID="68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.098290 
4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3"} err="failed to get container status \"68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3\": rpc error: code = NotFound desc = could not find container \"68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3\": container with ID starting with 68a7bfdd9a6a2bbbe4e0f5d3c955d8f816e0ef8b87e76bd424b8caeec2a73bc3 not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.124091 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.124133 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.124147 4941 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.124158 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1fb4667-396e-44bb-a2ed-e576a9b69be2-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.124166 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.124176 4941 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1fb4667-396e-44bb-a2ed-e576a9b69be2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.124187 4941 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1fb4667-396e-44bb-a2ed-e576a9b69be2-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.124198 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjnw5\" (UniqueName: \"kubernetes.io/projected/b1fb4667-396e-44bb-a2ed-e576a9b69be2-kube-api-access-jjnw5\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.149225 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.153888 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jjz5s" event={"ID":"99ca3e53-9ebc-464c-ac37-51163b9bc104","Type":"ContainerDied","Data":"6a52803a97ef3a4bbc312cab4b76fa6a183fd74f73e38061dfe093201b1abb7f"} Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.153904 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jjz5s" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.153934 4941 scope.go:117] "RemoveContainer" containerID="0b519bc4136cc105e727716b8acfe9c39c1115c2fb1a9245e1ee243a155acf47" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.171115 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d1ad12db-0b25-4e03-8772-de047be41b0d/ovn-northd/0.log" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.171157 4941 generic.go:334] "Generic (PLEG): container finished" podID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerID="f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e" exitCode=139 Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.171240 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.171579 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1ad12db-0b25-4e03-8772-de047be41b0d","Type":"ContainerDied","Data":"f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e"} Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.171628 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1ad12db-0b25-4e03-8772-de047be41b0d","Type":"ContainerDied","Data":"d6a720f08f8f966945eb7d616110127ed462090089cba4b66d921d17f0f339ee"} Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.178569 4941 generic.go:334] "Generic (PLEG): container finished" podID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" containerID="7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3" exitCode=0 Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.178720 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"b1fb4667-396e-44bb-a2ed-e576a9b69be2","Type":"ContainerDied","Data":"7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3"} Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.178807 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1fb4667-396e-44bb-a2ed-e576a9b69be2","Type":"ContainerDied","Data":"18d307fa7f9fdee1b297c14a672ca1ecc03942f9702524fffd3029aac8976ab3"} Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.178965 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.225760 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.251373 4941 scope.go:117] "RemoveContainer" containerID="98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.279979 4941 scope.go:117] "RemoveContainer" containerID="f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.299428 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jjz5s"] Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.313899 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jjz5s"] Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.324220 4941 scope.go:117] "RemoveContainer" containerID="98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.325437 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75\": container with ID starting with 98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75 not found: ID does not exist" containerID="98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.325464 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75"} err="failed to get container status \"98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75\": rpc error: code = NotFound desc = could not find container \"98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75\": container with ID starting with 98b33015c03bac854741c1b04cb6494dce5e0047189adcafb6d92d466315ec75 not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.325484 4941 scope.go:117] "RemoveContainer" containerID="f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.325820 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e\": container with ID starting with f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e not found: ID does not exist" containerID="f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.325836 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e"} err="failed to get container status \"f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e\": rpc error: code = NotFound desc = could not find container \"f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e\": container with ID 
starting with f37e36677dfdd94d46b4fa336e3e0e7f62b384ee293d7ba81017340152c8ee1e not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.325853 4941 scope.go:117] "RemoveContainer" containerID="7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.339154 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.348776 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.353495 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.357016 4941 scope.go:117] "RemoveContainer" containerID="0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.363461 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.385079 4941 scope.go:117] "RemoveContainer" containerID="7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.385961 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3\": container with ID starting with 7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3 not found: ID does not exist" containerID="7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.386007 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3"} err="failed to get container status 
\"7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3\": rpc error: code = NotFound desc = could not find container \"7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3\": container with ID starting with 7f685a4108caaa4e36108f8a55c285a49cde2d7ba1b6c62533d0207627b03aa3 not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.386038 4941 scope.go:117] "RemoveContainer" containerID="0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.386458 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3\": container with ID starting with 0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3 not found: ID does not exist" containerID="0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.386509 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3"} err="failed to get container status \"0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3\": rpc error: code = NotFound desc = could not find container \"0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3\": container with ID starting with 0a897e9d6eb2dfe5274f7722bda82f3f429420fd16b4565875bd08cbdfefaab3 not found: ID does not exist" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.697601 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.834508 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-config-data\") pod \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.834575 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-combined-ca-bundle\") pod \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.834598 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvvg6\" (UniqueName: \"kubernetes.io/projected/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-kube-api-access-vvvg6\") pod \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.834665 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-fernet-keys\") pod \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.834714 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-scripts\") pod \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.834745 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-credential-keys\") pod \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.834779 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-internal-tls-certs\") pod \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.834803 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-public-tls-certs\") pod \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\" (UID: \"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.835283 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.840235 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" (UID: "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.840636 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-kube-api-access-vvvg6" (OuterVolumeSpecName: "kube-api-access-vvvg6") pod "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" (UID: "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b"). InnerVolumeSpecName "kube-api-access-vvvg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.840640 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" (UID: "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.848224 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-scripts" (OuterVolumeSpecName: "scripts") pod "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" (UID: "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.893822 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-config-data" (OuterVolumeSpecName: "config-data") pod "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" (UID: "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.894434 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" (UID: "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.905372 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" (UID: "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.927832 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.929498 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.930377 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:17 crc kubenswrapper[4941]: E0307 07:17:17.930469 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" 
podUID="943d63f3-758d-4884-8086-93defd44f58a" containerName="nova-cell1-conductor-conductor" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.935887 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pdqw\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-kube-api-access-7pdqw\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.935958 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-server-conf\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.935991 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-plugins\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936024 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936053 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-erlang-cookie\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936112 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-erlang-cookie-secret\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936158 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-plugins-conf\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936208 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-confd\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936231 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-tls\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936259 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-pod-info\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936335 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data\") pod \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\" (UID: \"3963d293-d9e9-44b6-b0a5-b1532b4a0a31\") " Mar 07 07:17:17 crc 
kubenswrapper[4941]: I0307 07:17:17.936781 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936799 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936814 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvvg6\" (UniqueName: \"kubernetes.io/projected/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-kube-api-access-vvvg6\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936825 4941 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936836 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936847 4941 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.936859 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.940569 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-kube-api-access-7pdqw" (OuterVolumeSpecName: "kube-api-access-7pdqw") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "kube-api-access-7pdqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.941487 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" (UID: "b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.942037 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.942753 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.942814 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.943185 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.944949 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.945488 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-pod-info" (OuterVolumeSpecName: "pod-info") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.945610 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.955744 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data" (OuterVolumeSpecName: "config-data") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.966328 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b22d449-d1ec-4bf4-a876-b86a87508580" path="/var/lib/kubelet/pods/5b22d449-d1ec-4bf4-a876-b86a87508580/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.966797 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d" path="/var/lib/kubelet/pods/6dfaf3cb-4756-48c1-9fb3-c2fc0d21689d/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.967105 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c4b049-d672-41e8-b3cb-09b800f04a19" path="/var/lib/kubelet/pods/74c4b049-d672-41e8-b3cb-09b800f04a19/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.967789 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" path="/var/lib/kubelet/pods/753c78f9-47e6-4098-91fa-9adac0997ba4/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.968793 4941 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-server-conf" (OuterVolumeSpecName: "server-conf") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.969037 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" path="/var/lib/kubelet/pods/757b037d-b7b8-4690-93b9-ec85c5bf82db/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.969709 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" path="/var/lib/kubelet/pods/927b9eb0-124f-4a2c-86ae-2ea4cbe609e7/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.970822 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ca3e53-9ebc-464c-ac37-51163b9bc104" path="/var/lib/kubelet/pods/99ca3e53-9ebc-464c-ac37-51163b9bc104/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.971671 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" path="/var/lib/kubelet/pods/a0b4aa9b-9dc9-4e99-87e8-6320c5a81456/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.972253 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" path="/var/lib/kubelet/pods/ad2b6a75-839f-4fec-9f12-fb520b44c7ce/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.973227 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b055e9de-2e86-467d-9e93-8fd06977cc87" path="/var/lib/kubelet/pods/b055e9de-2e86-467d-9e93-8fd06977cc87/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.973812 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" path="/var/lib/kubelet/pods/b1fb4667-396e-44bb-a2ed-e576a9b69be2/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.974420 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c892cbf7-126c-4638-854d-18cef63c7747" path="/var/lib/kubelet/pods/c892cbf7-126c-4638-854d-18cef63c7747/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.975368 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" path="/var/lib/kubelet/pods/d1ad12db-0b25-4e03-8772-de047be41b0d/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.975948 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" path="/var/lib/kubelet/pods/e6d72c12-422e-48fd-b56b-8344260e3e01/volumes" Mar 07 07:17:17 crc kubenswrapper[4941]: I0307 07:17:17.976979 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" path="/var/lib/kubelet/pods/fc5e0ad9-b4e5-4307-a381-3a92092a3240/volumes" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.029083 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3963d293-d9e9-44b6-b0a5-b1532b4a0a31" (UID: "3963d293-d9e9-44b6-b0a5-b1532b4a0a31"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040559 4941 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040588 4941 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040600 4941 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040608 4941 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-pod-info\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040617 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040625 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040634 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pdqw\" (UniqueName: \"kubernetes.io/projected/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-kube-api-access-7pdqw\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040642 4941 
reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-server-conf\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040650 4941 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040673 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040683 4941 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.040691 4941 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3963d293-d9e9-44b6-b0a5-b1532b4a0a31-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.055197 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.142114 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.204507 4941 generic.go:334] "Generic (PLEG): container finished" podID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" containerID="f7a8e765543e88a1c6e7d28463ec6d1148163252cc8cc4989b9a46a6cdfd7693" 
exitCode=0 Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.204608 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670","Type":"ContainerDied","Data":"f7a8e765543e88a1c6e7d28463ec6d1148163252cc8cc4989b9a46a6cdfd7693"} Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.237319 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5c79966c-pthdw" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.237323 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5c79966c-pthdw" event={"ID":"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b","Type":"ContainerDied","Data":"b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1"} Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.237656 4941 scope.go:117] "RemoveContainer" containerID="b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.237221 4941 generic.go:334] "Generic (PLEG): container finished" podID="b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" containerID="b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1" exitCode=0 Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.237750 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5c79966c-pthdw" event={"ID":"b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b","Type":"ContainerDied","Data":"a775c081a00ad28718e2adc3495a133d34974dbc35088e0179f8cb2b6dcb41b2"} Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.242185 4941 generic.go:334] "Generic (PLEG): container finished" podID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerID="fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55" exitCode=0 Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.242225 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"3963d293-d9e9-44b6-b0a5-b1532b4a0a31","Type":"ContainerDied","Data":"fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55"} Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.242251 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3963d293-d9e9-44b6-b0a5-b1532b4a0a31","Type":"ContainerDied","Data":"d103828190b73cd66b52e3c87baabe0626b7b15848c46ec255fb053470608b21"} Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.242254 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.284225 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5c79966c-pthdw"] Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.305451 4941 scope.go:117] "RemoveContainer" containerID="b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1" Mar 07 07:17:18 crc kubenswrapper[4941]: E0307 07:17:18.306759 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1\": container with ID starting with b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1 not found: ID does not exist" containerID="b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.306802 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1"} err="failed to get container status \"b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1\": rpc error: code = NotFound desc = could not find container \"b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1\": container with ID starting with b109f7f57212d8e8c114a9a627a5a319764e1da9f3d67fde616e4969dbfc95a1 not 
found: ID does not exist" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.306824 4941 scope.go:117] "RemoveContainer" containerID="fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.309368 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5c79966c-pthdw"] Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.319563 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.323828 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.336890 4941 scope.go:117] "RemoveContainer" containerID="c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.374550 4941 scope.go:117] "RemoveContainer" containerID="fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55" Mar 07 07:17:18 crc kubenswrapper[4941]: E0307 07:17:18.374957 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55\": container with ID starting with fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55 not found: ID does not exist" containerID="fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.374993 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55"} err="failed to get container status \"fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55\": rpc error: code = NotFound desc = could not find container \"fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55\": container with ID starting with 
fb34c82f5b3e748bcf461b8c680f622fbd3d3b441013c66d1fb770a45f687c55 not found: ID does not exist" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.375046 4941 scope.go:117] "RemoveContainer" containerID="c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86" Mar 07 07:17:18 crc kubenswrapper[4941]: E0307 07:17:18.375391 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86\": container with ID starting with c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86 not found: ID does not exist" containerID="c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.375435 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86"} err="failed to get container status \"c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86\": rpc error: code = NotFound desc = could not find container \"c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86\": container with ID starting with c2e50e54d812cc28e53051fff65c9444476dee98164deae3a30d1ddc3f4e4e86 not found: ID does not exist" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.450773 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.554861 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-erlang-cookie\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555299 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-confd\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555335 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555372 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-plugins\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555435 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nvs7\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-kube-api-access-6nvs7\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555482 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-pod-info\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555528 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-server-conf\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555562 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-tls\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555595 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-erlang-cookie-secret\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555630 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-plugins-conf\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.555750 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data\") pod \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\" (UID: \"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670\") " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 
07:17:18.555795 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.556273 4941 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.558292 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.558665 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.561439 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.561966 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-pod-info" (OuterVolumeSpecName: "pod-info") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.562129 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.565026 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-kube-api-access-6nvs7" (OuterVolumeSpecName: "kube-api-access-6nvs7") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "kube-api-access-6nvs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.565663 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.574870 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data" (OuterVolumeSpecName: "config-data") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.626236 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-server-conf" (OuterVolumeSpecName: "server-conf") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657208 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657247 4941 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657258 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nvs7\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-kube-api-access-6nvs7\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657268 4941 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-pod-info\") on node \"crc\" DevicePath \"\"" Mar 07 
07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657276 4941 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-server-conf\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657285 4941 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657293 4941 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657300 4941 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.657308 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.658242 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" (UID: "aeb1dd04-5b8c-49b4-bf65-be38fb8ae670"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.671687 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.758769 4941 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: I0307 07:17:18.758813 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:18 crc kubenswrapper[4941]: E0307 07:17:18.851893 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:18 crc kubenswrapper[4941]: E0307 07:17:18.853384 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:18 crc kubenswrapper[4941]: E0307 07:17:18.855054 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 07 07:17:18 crc 
kubenswrapper[4941]: E0307 07:17:18.855130 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" containerName="nova-cell0-conductor-conductor" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.027528 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.043893 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.063694 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twnms\" (UniqueName: \"kubernetes.io/projected/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-kube-api-access-twnms\") pod \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.063767 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data\") pod \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.063854 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-logs\") pod \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.063878 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-combined-ca-bundle\") pod \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.063949 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data-custom\") pod \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\" (UID: \"ea4583b7-29d7-466d-8c3d-ad9981ebc66d\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.064521 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-logs" (OuterVolumeSpecName: "logs") pod "ea4583b7-29d7-466d-8c3d-ad9981ebc66d" (UID: "ea4583b7-29d7-466d-8c3d-ad9981ebc66d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.068141 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-kube-api-access-twnms" (OuterVolumeSpecName: "kube-api-access-twnms") pod "ea4583b7-29d7-466d-8c3d-ad9981ebc66d" (UID: "ea4583b7-29d7-466d-8c3d-ad9981ebc66d"). InnerVolumeSpecName "kube-api-access-twnms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.071990 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ea4583b7-29d7-466d-8c3d-ad9981ebc66d" (UID: "ea4583b7-29d7-466d-8c3d-ad9981ebc66d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.100253 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea4583b7-29d7-466d-8c3d-ad9981ebc66d" (UID: "ea4583b7-29d7-466d-8c3d-ad9981ebc66d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.124771 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data" (OuterVolumeSpecName: "config-data") pod "ea4583b7-29d7-466d-8c3d-ad9981ebc66d" (UID: "ea4583b7-29d7-466d-8c3d-ad9981ebc66d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.164785 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-combined-ca-bundle\") pod \"e27683db-592f-485a-93b3-93273e1644c3\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.165025 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e27683db-592f-485a-93b3-93273e1644c3-logs\") pod \"e27683db-592f-485a-93b3-93273e1644c3\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.165139 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data\") pod \"e27683db-592f-485a-93b3-93273e1644c3\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " Mar 07 07:17:19 crc 
kubenswrapper[4941]: I0307 07:17:19.165246 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data-custom\") pod \"e27683db-592f-485a-93b3-93273e1644c3\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.165474 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e27683db-592f-485a-93b3-93273e1644c3-logs" (OuterVolumeSpecName: "logs") pod "e27683db-592f-485a-93b3-93273e1644c3" (UID: "e27683db-592f-485a-93b3-93273e1644c3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.165731 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4l9f\" (UniqueName: \"kubernetes.io/projected/e27683db-592f-485a-93b3-93273e1644c3-kube-api-access-l4l9f\") pod \"e27683db-592f-485a-93b3-93273e1644c3\" (UID: \"e27683db-592f-485a-93b3-93273e1644c3\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.166112 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.166128 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twnms\" (UniqueName: \"kubernetes.io/projected/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-kube-api-access-twnms\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.166139 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.166148 
4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.166156 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4583b7-29d7-466d-8c3d-ad9981ebc66d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.166163 4941 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e27683db-592f-485a-93b3-93273e1644c3-logs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.168135 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27683db-592f-485a-93b3-93273e1644c3-kube-api-access-l4l9f" (OuterVolumeSpecName: "kube-api-access-l4l9f") pod "e27683db-592f-485a-93b3-93273e1644c3" (UID: "e27683db-592f-485a-93b3-93273e1644c3"). InnerVolumeSpecName "kube-api-access-l4l9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.171569 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e27683db-592f-485a-93b3-93273e1644c3" (UID: "e27683db-592f-485a-93b3-93273e1644c3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.186840 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e27683db-592f-485a-93b3-93273e1644c3" (UID: "e27683db-592f-485a-93b3-93273e1644c3"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.206334 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data" (OuterVolumeSpecName: "config-data") pod "e27683db-592f-485a-93b3-93273e1644c3" (UID: "e27683db-592f-485a-93b3-93273e1644c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.253601 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aeb1dd04-5b8c-49b4-bf65-be38fb8ae670","Type":"ContainerDied","Data":"f35c2527ef2f2e6926df80b3a68dfacaba9b01bed657fc4d575095c5fe323e92"} Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.253655 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.253662 4941 scope.go:117] "RemoveContainer" containerID="f7a8e765543e88a1c6e7d28463ec6d1148163252cc8cc4989b9a46a6cdfd7693" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.255612 4941 generic.go:334] "Generic (PLEG): container finished" podID="e27683db-592f-485a-93b3-93273e1644c3" containerID="1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3" exitCode=0 Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.255691 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57855ff457-mshjt" event={"ID":"e27683db-592f-485a-93b3-93273e1644c3","Type":"ContainerDied","Data":"1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3"} Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.255716 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57855ff457-mshjt" 
event={"ID":"e27683db-592f-485a-93b3-93273e1644c3","Type":"ContainerDied","Data":"8343bff8e6b25be2f73ec6dd7a92169c700eb64d11432479ec75e91e67910eaa"} Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.255758 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-57855ff457-mshjt" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.267379 4941 generic.go:334] "Generic (PLEG): container finished" podID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerID="22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380" exitCode=0 Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.267437 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" event={"ID":"ea4583b7-29d7-466d-8c3d-ad9981ebc66d","Type":"ContainerDied","Data":"22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380"} Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.267468 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" event={"ID":"ea4583b7-29d7-466d-8c3d-ad9981ebc66d","Type":"ContainerDied","Data":"47388a5e4567c5d86ee5a3b0900abeeb98ae104b94c78f56ce15948eaf6de1b5"} Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.267530 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-86888b7b66-mgpdx" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.267976 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.267994 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.268006 4941 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e27683db-592f-485a-93b3-93273e1644c3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.268020 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4l9f\" (UniqueName: \"kubernetes.io/projected/e27683db-592f-485a-93b3-93273e1644c3-kube-api-access-l4l9f\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.295262 4941 scope.go:117] "RemoveContainer" containerID="a12662163e378ef0047a7d0c3ffc76b2214269655c2741ec69ff5a13c078ddf4" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.317555 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-57855ff457-mshjt"] Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.324473 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-57855ff457-mshjt"] Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.329112 4941 scope.go:117] "RemoveContainer" containerID="1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.338468 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-keystone-listener-86888b7b66-mgpdx"] Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.351074 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-86888b7b66-mgpdx"] Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.359801 4941 scope.go:117] "RemoveContainer" containerID="5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.361225 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.371031 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.386990 4941 scope.go:117] "RemoveContainer" containerID="1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3" Mar 07 07:17:19 crc kubenswrapper[4941]: E0307 07:17:19.387563 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3\": container with ID starting with 1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3 not found: ID does not exist" containerID="1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.387595 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3"} err="failed to get container status \"1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3\": rpc error: code = NotFound desc = could not find container \"1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3\": container with ID starting with 1c3a413ba405f95e0a0014d1315bc4171811b9889fe35c0e478a43aa0b412bf3 not found: ID does not exist" Mar 07 07:17:19 crc 
kubenswrapper[4941]: I0307 07:17:19.387616 4941 scope.go:117] "RemoveContainer" containerID="5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02" Mar 07 07:17:19 crc kubenswrapper[4941]: E0307 07:17:19.387941 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02\": container with ID starting with 5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02 not found: ID does not exist" containerID="5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.387959 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02"} err="failed to get container status \"5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02\": rpc error: code = NotFound desc = could not find container \"5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02\": container with ID starting with 5ed5ef6d5be2531a2b10e8ac859bcb1d8a560dd65521a1b989c82ce3f9e87c02 not found: ID does not exist" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.387973 4941 scope.go:117] "RemoveContainer" containerID="22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.414539 4941 scope.go:117] "RemoveContainer" containerID="3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.431058 4941 scope.go:117] "RemoveContainer" containerID="22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380" Mar 07 07:17:19 crc kubenswrapper[4941]: E0307 07:17:19.431515 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380\": container with ID starting with 22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380 not found: ID does not exist" containerID="22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.431655 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380"} err="failed to get container status \"22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380\": rpc error: code = NotFound desc = could not find container \"22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380\": container with ID starting with 22c1b092d3e9a8bdc4399876e578f0c462e9f77a91ed13354c75ce1d43091380 not found: ID does not exist" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.431767 4941 scope.go:117] "RemoveContainer" containerID="3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3" Mar 07 07:17:19 crc kubenswrapper[4941]: E0307 07:17:19.432314 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3\": container with ID starting with 3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3 not found: ID does not exist" containerID="3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.432352 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3"} err="failed to get container status \"3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3\": rpc error: code = NotFound desc = could not find container \"3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3\": container with ID 
starting with 3303b50f68a521a17ce6ea4bdfdd7f14e530a5feea2f3c7f904b8d9a9f946eb3 not found: ID does not exist" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.894333 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.967133 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" path="/var/lib/kubelet/pods/3963d293-d9e9-44b6-b0a5-b1532b4a0a31/volumes" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.968116 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" path="/var/lib/kubelet/pods/aeb1dd04-5b8c-49b4-bf65-be38fb8ae670/volumes" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.969335 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" path="/var/lib/kubelet/pods/b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b/volumes" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.970001 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e27683db-592f-485a-93b3-93273e1644c3" path="/var/lib/kubelet/pods/e27683db-592f-485a-93b3-93273e1644c3/volumes" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.970790 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" path="/var/lib/kubelet/pods/ea4583b7-29d7-466d-8c3d-ad9981ebc66d/volumes" Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.986617 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-config-data\") pod \"943d63f3-758d-4884-8086-93defd44f58a\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.986697 4941 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-combined-ca-bundle\") pod \"943d63f3-758d-4884-8086-93defd44f58a\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.986781 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtbwv\" (UniqueName: \"kubernetes.io/projected/943d63f3-758d-4884-8086-93defd44f58a-kube-api-access-rtbwv\") pod \"943d63f3-758d-4884-8086-93defd44f58a\" (UID: \"943d63f3-758d-4884-8086-93defd44f58a\") " Mar 07 07:17:19 crc kubenswrapper[4941]: I0307 07:17:19.993575 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943d63f3-758d-4884-8086-93defd44f58a-kube-api-access-rtbwv" (OuterVolumeSpecName: "kube-api-access-rtbwv") pod "943d63f3-758d-4884-8086-93defd44f58a" (UID: "943d63f3-758d-4884-8086-93defd44f58a"). InnerVolumeSpecName "kube-api-access-rtbwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.017282 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "943d63f3-758d-4884-8086-93defd44f58a" (UID: "943d63f3-758d-4884-8086-93defd44f58a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.030553 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-config-data" (OuterVolumeSpecName: "config-data") pod "943d63f3-758d-4884-8086-93defd44f58a" (UID: "943d63f3-758d-4884-8086-93defd44f58a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.088770 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.088796 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtbwv\" (UniqueName: \"kubernetes.io/projected/943d63f3-758d-4884-8086-93defd44f58a-kube-api-access-rtbwv\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.088825 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/943d63f3-758d-4884-8086-93defd44f58a-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.132962 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.189359 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-config-data\") pod \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.189506 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wv4m\" (UniqueName: \"kubernetes.io/projected/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-kube-api-access-5wv4m\") pod \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.189578 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-combined-ca-bundle\") pod \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\" (UID: \"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20\") " Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.194667 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-kube-api-access-5wv4m" (OuterVolumeSpecName: "kube-api-access-5wv4m") pod "6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" (UID: "6da1ac3d-bac5-4c8e-a920-6b6dff25fd20"). InnerVolumeSpecName "kube-api-access-5wv4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.214334 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" (UID: "6da1ac3d-bac5-4c8e-a920-6b6dff25fd20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.215412 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-config-data" (OuterVolumeSpecName: "config-data") pod "6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" (UID: "6da1ac3d-bac5-4c8e-a920-6b6dff25fd20"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.299770 4941 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.300122 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wv4m\" (UniqueName: \"kubernetes.io/projected/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-kube-api-access-5wv4m\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.300135 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.315906 4941 generic.go:334] "Generic (PLEG): container finished" podID="6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" exitCode=0 Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.315968 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20","Type":"ContainerDied","Data":"d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e"} Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.315991 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6da1ac3d-bac5-4c8e-a920-6b6dff25fd20","Type":"ContainerDied","Data":"ba2cd2c70b60073262a137329b590ea441c9826a1e5ac8aec0621a560eaf1650"} Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.316008 4941 scope.go:117] "RemoveContainer" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.316119 4941 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.326790 4941 generic.go:334] "Generic (PLEG): container finished" podID="943d63f3-758d-4884-8086-93defd44f58a" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" exitCode=0 Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.326842 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"943d63f3-758d-4884-8086-93defd44f58a","Type":"ContainerDied","Data":"272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97"} Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.326869 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"943d63f3-758d-4884-8086-93defd44f58a","Type":"ContainerDied","Data":"4695c817ce463aa48ad28a254de7a04f00fe7c0d74c1bf91c8e5ef6ead291855"} Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.327049 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.340669 4941 scope.go:117] "RemoveContainer" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.341214 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e\": container with ID starting with d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e not found: ID does not exist" containerID="d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.341266 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e"} err="failed to get container status \"d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e\": rpc error: code = NotFound desc = could not find container \"d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e\": container with ID starting with d1fe3142016b198d70d0f3c451bae823ebcc7876f0ba723c176d23bfecb2cc9e not found: ID does not exist" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.341298 4941 scope.go:117] "RemoveContainer" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.361620 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.367600 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.371170 4941 scope.go:117] "RemoveContainer" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" Mar 07 07:17:20 crc 
kubenswrapper[4941]: E0307 07:17:20.371637 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97\": container with ID starting with 272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97 not found: ID does not exist" containerID="272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.372032 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97"} err="failed to get container status \"272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97\": rpc error: code = NotFound desc = could not find container \"272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97\": container with ID starting with 272257e9f5894a80f82e63fdb6bdb3e13ffb2a996e7376515ee57abf7f697a97 not found: ID does not exist" Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.379391 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 07 07:17:20 crc kubenswrapper[4941]: I0307 07:17:20.385535 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.771234 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.771884 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.772360 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.772452 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server" Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.772559 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.774855 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.777662 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:20 crc kubenswrapper[4941]: E0307 07:17:20.777698 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd" Mar 07 07:17:21 crc kubenswrapper[4941]: I0307 07:17:21.963751 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" path="/var/lib/kubelet/pods/6da1ac3d-bac5-4c8e-a920-6b6dff25fd20/volumes" Mar 07 07:17:21 crc kubenswrapper[4941]: I0307 07:17:21.965154 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943d63f3-758d-4884-8086-93defd44f58a" path="/var/lib/kubelet/pods/943d63f3-758d-4884-8086-93defd44f58a/volumes" Mar 07 07:17:25 crc kubenswrapper[4941]: E0307 07:17:25.771119 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:25 crc kubenswrapper[4941]: E0307 07:17:25.774853 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:25 crc kubenswrapper[4941]: E0307 07:17:25.774947 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:25 crc kubenswrapper[4941]: E0307 07:17:25.776041 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:25 crc kubenswrapper[4941]: E0307 07:17:25.776092 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server" Mar 07 07:17:25 crc kubenswrapper[4941]: E0307 07:17:25.777177 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:25 crc kubenswrapper[4941]: E0307 07:17:25.779257 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:25 crc kubenswrapper[4941]: E0307 07:17:25.779310 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.056731 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.171993 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27jhd\" (UniqueName: \"kubernetes.io/projected/d3cb3645-4e27-450f-a712-f656dfa9e8e1-kube-api-access-27jhd\") pod \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.172062 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-public-tls-certs\") pod \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.172099 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-ovndb-tls-certs\") pod \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.172158 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-internal-tls-certs\") pod \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.172206 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-combined-ca-bundle\") pod \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.172234 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-httpd-config\") pod \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.172257 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-config\") pod \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\" (UID: \"d3cb3645-4e27-450f-a712-f656dfa9e8e1\") " Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.178621 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d3cb3645-4e27-450f-a712-f656dfa9e8e1" (UID: "d3cb3645-4e27-450f-a712-f656dfa9e8e1"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.178664 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3cb3645-4e27-450f-a712-f656dfa9e8e1-kube-api-access-27jhd" (OuterVolumeSpecName: "kube-api-access-27jhd") pod "d3cb3645-4e27-450f-a712-f656dfa9e8e1" (UID: "d3cb3645-4e27-450f-a712-f656dfa9e8e1"). InnerVolumeSpecName "kube-api-access-27jhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.204686 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d3cb3645-4e27-450f-a712-f656dfa9e8e1" (UID: "d3cb3645-4e27-450f-a712-f656dfa9e8e1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.212404 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d3cb3645-4e27-450f-a712-f656dfa9e8e1" (UID: "d3cb3645-4e27-450f-a712-f656dfa9e8e1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.220350 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-config" (OuterVolumeSpecName: "config") pod "d3cb3645-4e27-450f-a712-f656dfa9e8e1" (UID: "d3cb3645-4e27-450f-a712-f656dfa9e8e1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.232298 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3cb3645-4e27-450f-a712-f656dfa9e8e1" (UID: "d3cb3645-4e27-450f-a712-f656dfa9e8e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.244655 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d3cb3645-4e27-450f-a712-f656dfa9e8e1" (UID: "d3cb3645-4e27-450f-a712-f656dfa9e8e1"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.275041 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27jhd\" (UniqueName: \"kubernetes.io/projected/d3cb3645-4e27-450f-a712-f656dfa9e8e1-kube-api-access-27jhd\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.275205 4941 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.275300 4941 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.275382 4941 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-internal-tls-certs\") 
on node \"crc\" DevicePath \"\"" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.275505 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.275600 4941 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.275687 4941 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cb3645-4e27-450f-a712-f656dfa9e8e1-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.451912 4941 generic.go:334] "Generic (PLEG): container finished" podID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerID="f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0" exitCode=0 Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.451954 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c6df5b777-qhsgz" event={"ID":"d3cb3645-4e27-450f-a712-f656dfa9e8e1","Type":"ContainerDied","Data":"f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0"} Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.451980 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c6df5b777-qhsgz" event={"ID":"d3cb3645-4e27-450f-a712-f656dfa9e8e1","Type":"ContainerDied","Data":"1e08abf6cfacf4ab3fe67f49f69418d92c7363bf3dc7d19bd2ae73864fbbd87f"} Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.451998 4941 scope.go:117] "RemoveContainer" containerID="02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.452121 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c6df5b777-qhsgz" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.494488 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c6df5b777-qhsgz"] Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.495751 4941 scope.go:117] "RemoveContainer" containerID="f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.499145 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c6df5b777-qhsgz"] Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.524617 4941 scope.go:117] "RemoveContainer" containerID="02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e" Mar 07 07:17:29 crc kubenswrapper[4941]: E0307 07:17:29.525076 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e\": container with ID starting with 02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e not found: ID does not exist" containerID="02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.525186 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e"} err="failed to get container status \"02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e\": rpc error: code = NotFound desc = could not find container \"02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e\": container with ID starting with 02af86707f926f63b94a89e930e5bad157f8952cf5491dcebdef8e9862f1f39e not found: ID does not exist" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.525277 4941 scope.go:117] "RemoveContainer" containerID="f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0" Mar 07 07:17:29 
crc kubenswrapper[4941]: E0307 07:17:29.525564 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0\": container with ID starting with f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0 not found: ID does not exist" containerID="f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.525585 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0"} err="failed to get container status \"f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0\": rpc error: code = NotFound desc = could not find container \"f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0\": container with ID starting with f2776ebe60f7ea12aae1e69439654467871a600e01895f035b01fd91b4e7c2d0 not found: ID does not exist" Mar 07 07:17:29 crc kubenswrapper[4941]: I0307 07:17:29.964296 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" path="/var/lib/kubelet/pods/d3cb3645-4e27-450f-a712-f656dfa9e8e1/volumes" Mar 07 07:17:30 crc kubenswrapper[4941]: E0307 07:17:30.770767 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:30 crc kubenswrapper[4941]: E0307 07:17:30.771386 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:30 crc kubenswrapper[4941]: E0307 07:17:30.772172 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:30 crc kubenswrapper[4941]: E0307 07:17:30.772225 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:30 crc kubenswrapper[4941]: E0307 07:17:30.772256 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server" Mar 07 07:17:30 crc kubenswrapper[4941]: E0307 07:17:30.773997 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 
07:17:30 crc kubenswrapper[4941]: E0307 07:17:30.775307 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:30 crc kubenswrapper[4941]: E0307 07:17:30.775344 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd" Mar 07 07:17:35 crc kubenswrapper[4941]: E0307 07:17:35.770252 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:35 crc kubenswrapper[4941]: E0307 07:17:35.771377 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:35 crc kubenswrapper[4941]: E0307 07:17:35.771669 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:35 crc kubenswrapper[4941]: E0307 07:17:35.771742 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 07 07:17:35 crc kubenswrapper[4941]: E0307 07:17:35.771761 4941 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server" Mar 07 07:17:35 crc kubenswrapper[4941]: E0307 07:17:35.772926 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:35 crc kubenswrapper[4941]: E0307 07:17:35.774186 4941 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 07 07:17:35 crc kubenswrapper[4941]: E0307 07:17:35.774212 4941 prober.go:104] "Probe errored" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-vrr7t" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.314315 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.314918 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.343257 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrr7t_531af2a1-d934-48a5-b3de-61d475bf252f/ovs-vswitchd/0.log" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.343919 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460213 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-log\") pod \"531af2a1-d934-48a5-b3de-61d475bf252f\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460279 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-log" (OuterVolumeSpecName: "var-log") pod "531af2a1-d934-48a5-b3de-61d475bf252f" (UID: "531af2a1-d934-48a5-b3de-61d475bf252f"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460290 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-run\") pod \"531af2a1-d934-48a5-b3de-61d475bf252f\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460332 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/531af2a1-d934-48a5-b3de-61d475bf252f-scripts\") pod \"531af2a1-d934-48a5-b3de-61d475bf252f\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460378 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-etc-ovs\") pod \"531af2a1-d934-48a5-b3de-61d475bf252f\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460380 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-run" (OuterVolumeSpecName: "var-run") pod "531af2a1-d934-48a5-b3de-61d475bf252f" (UID: "531af2a1-d934-48a5-b3de-61d475bf252f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460472 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8k4m\" (UniqueName: \"kubernetes.io/projected/531af2a1-d934-48a5-b3de-61d475bf252f-kube-api-access-m8k4m\") pod \"531af2a1-d934-48a5-b3de-61d475bf252f\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460495 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-lib\") pod \"531af2a1-d934-48a5-b3de-61d475bf252f\" (UID: \"531af2a1-d934-48a5-b3de-61d475bf252f\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460536 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "531af2a1-d934-48a5-b3de-61d475bf252f" (UID: "531af2a1-d934-48a5-b3de-61d475bf252f"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460713 4941 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-run\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460725 4941 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460733 4941 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-log\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.460780 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-lib" (OuterVolumeSpecName: "var-lib") pod "531af2a1-d934-48a5-b3de-61d475bf252f" (UID: "531af2a1-d934-48a5-b3de-61d475bf252f"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.461467 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531af2a1-d934-48a5-b3de-61d475bf252f-scripts" (OuterVolumeSpecName: "scripts") pod "531af2a1-d934-48a5-b3de-61d475bf252f" (UID: "531af2a1-d934-48a5-b3de-61d475bf252f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.470736 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/531af2a1-d934-48a5-b3de-61d475bf252f-kube-api-access-m8k4m" (OuterVolumeSpecName: "kube-api-access-m8k4m") pod "531af2a1-d934-48a5-b3de-61d475bf252f" (UID: "531af2a1-d934-48a5-b3de-61d475bf252f"). InnerVolumeSpecName "kube-api-access-m8k4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.515291 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.562545 4941 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/531af2a1-d934-48a5-b3de-61d475bf252f-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.562882 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8k4m\" (UniqueName: \"kubernetes.io/projected/531af2a1-d934-48a5-b3de-61d475bf252f-kube-api-access-m8k4m\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.562894 4941 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/531af2a1-d934-48a5-b3de-61d475bf252f-var-lib\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.576460 4941 generic.go:334] "Generic (PLEG): container finished" podID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerID="de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed" exitCode=137 Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.576663 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed"} Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.576742 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6a5f223a-7907-42a5-954b-fafc3c4b78da","Type":"ContainerDied","Data":"da1654ea95b5022cff095825806dfbe23192b7b16b6b327904fd76eee3f987d1"} Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.576834 4941 scope.go:117] "RemoveContainer" containerID="de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.577042 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.580863 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrr7t_531af2a1-d934-48a5-b3de-61d475bf252f/ovs-vswitchd/0.log" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.582552 4941 generic.go:334] "Generic (PLEG): container finished" podID="531af2a1-d934-48a5-b3de-61d475bf252f" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" exitCode=137 Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.582581 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrr7t" event={"ID":"531af2a1-d934-48a5-b3de-61d475bf252f","Type":"ContainerDied","Data":"bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e"} Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.582601 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrr7t" event={"ID":"531af2a1-d934-48a5-b3de-61d475bf252f","Type":"ContainerDied","Data":"49c6452edd07d78c7f49664743d4192b47a0b8dc53d1b2b4f64e29bc0dbd0010"} Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.582631 4941 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vrr7t" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.608893 4941 scope.go:117] "RemoveContainer" containerID="85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.615378 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-vrr7t"] Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.620103 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-vrr7t"] Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.641672 4941 scope.go:117] "RemoveContainer" containerID="855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.665441 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-lock\") pod \"6a5f223a-7907-42a5-954b-fafc3c4b78da\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.665583 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-cache\") pod \"6a5f223a-7907-42a5-954b-fafc3c4b78da\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.665609 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4nrl\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-kube-api-access-z4nrl\") pod \"6a5f223a-7907-42a5-954b-fafc3c4b78da\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.665647 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") pod \"6a5f223a-7907-42a5-954b-fafc3c4b78da\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.665667 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6a5f223a-7907-42a5-954b-fafc3c4b78da\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.665713 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a5f223a-7907-42a5-954b-fafc3c4b78da-combined-ca-bundle\") pod \"6a5f223a-7907-42a5-954b-fafc3c4b78da\" (UID: \"6a5f223a-7907-42a5-954b-fafc3c4b78da\") " Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.667175 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-lock" (OuterVolumeSpecName: "lock") pod "6a5f223a-7907-42a5-954b-fafc3c4b78da" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.667588 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-cache" (OuterVolumeSpecName: "cache") pod "6a5f223a-7907-42a5-954b-fafc3c4b78da" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.669455 4941 scope.go:117] "RemoveContainer" containerID="56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.670561 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "6a5f223a-7907-42a5-954b-fafc3c4b78da" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.670825 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6a5f223a-7907-42a5-954b-fafc3c4b78da" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.673226 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-kube-api-access-z4nrl" (OuterVolumeSpecName: "kube-api-access-z4nrl") pod "6a5f223a-7907-42a5-954b-fafc3c4b78da" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da"). InnerVolumeSpecName "kube-api-access-z4nrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.693771 4941 scope.go:117] "RemoveContainer" containerID="2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.724906 4941 scope.go:117] "RemoveContainer" containerID="abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.749486 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bk5cq"] Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750555 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" containerName="mysql-bootstrap" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750573 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" containerName="mysql-bootstrap" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750594 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b22d449-d1ec-4bf4-a876-b86a87508580" containerName="memcached" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750601 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b22d449-d1ec-4bf4-a876-b86a87508580" containerName="memcached" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750610 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="rsync" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750618 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="rsync" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750628 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerName="glance-httpd" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 
07:17:40.750637 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerName="glance-httpd" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750652 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerName="barbican-keystone-listener" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750659 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerName="barbican-keystone-listener" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750673 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api-log" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750679 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api-log" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750693 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-updater" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750699 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-updater" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750708 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" containerName="setup-container" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750715 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" containerName="setup-container" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750726 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-api" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 
07:17:40.750733 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-api" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750749 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-api" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750803 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-api" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750816 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server-init" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750823 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server-init" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750832 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-replicator" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750839 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-replicator" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.750850 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerName="glance-log" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.750858 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerName="glance-log" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751083 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-auditor" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751095 
4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-auditor" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751106 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="ceilometer-central-agent" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751113 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="ceilometer-central-agent" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751127 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ca3e53-9ebc-464c-ac37-51163b9bc104" containerName="mariadb-account-create-update" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751137 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ca3e53-9ebc-464c-ac37-51163b9bc104" containerName="mariadb-account-create-update" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751147 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-metadata" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751154 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-metadata" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751169 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" containerName="galera" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751177 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" containerName="galera" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751185 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c4b049-d672-41e8-b3cb-09b800f04a19" containerName="kube-state-metrics" Mar 07 07:17:40 crc 
kubenswrapper[4941]: I0307 07:17:40.751192 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c4b049-d672-41e8-b3cb-09b800f04a19" containerName="kube-state-metrics" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751203 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-reaper" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751209 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-reaper" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751220 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" containerName="rabbitmq" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751227 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" containerName="rabbitmq" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751234 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="proxy-httpd" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751242 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="proxy-httpd" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751250 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerName="placement-log" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751258 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerName="placement-log" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751271 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerName="rabbitmq" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751278 
4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerName="rabbitmq" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751291 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerName="placement-api" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751299 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerName="placement-api" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751305 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751312 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751327 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-updater" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751333 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-updater" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751340 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-log" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751347 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-log" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751359 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="ceilometer-notification-agent" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751366 4941 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="ceilometer-notification-agent" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751375 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-server" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751382 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-server" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751394 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" containerName="keystone-api" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751404 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" containerName="keystone-api" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751431 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-auditor" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751439 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-auditor" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751449 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerName="setup-container" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751456 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerName="setup-container" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.751468 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerName="glance-log" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.751477 4941 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerName="glance-log" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.753222 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-replicator" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.753265 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-replicator" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.753282 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-httpd" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.753292 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-httpd" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.753332 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" containerName="nova-cell0-conductor-conductor" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756617 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" containerName="nova-cell0-conductor-conductor" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756650 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerName="proxy-server" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756658 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerName="proxy-server" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756670 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-expirer" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756679 4941 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-expirer" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756691 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756697 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756705 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943d63f3-758d-4884-8086-93defd44f58a" containerName="nova-cell1-conductor-conductor" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756712 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="943d63f3-758d-4884-8086-93defd44f58a" containerName="nova-cell1-conductor-conductor" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756724 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="sg-core" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756730 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="sg-core" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756743 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-log" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756750 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-log" Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756760 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" containerName="galera" Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756766 4941 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" containerName="galera"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756776 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756782 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756791 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerName="proxy-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756797 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerName="proxy-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756807 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ca3e53-9ebc-464c-ac37-51163b9bc104" containerName="mariadb-account-create-update"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756813 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ca3e53-9ebc-464c-ac37-51163b9bc104" containerName="mariadb-account-create-update"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756823 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="ovn-northd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756829 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="ovn-northd"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756836 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756843 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756852 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756859 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756869 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerName="glance-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756874 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerName="glance-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756882 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756893 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756902 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-replicator"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756907 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-replicator"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756918 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" containerName="mysql-bootstrap"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756924 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" containerName="mysql-bootstrap"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756932 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756937 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756946 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="swift-recon-cron"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756952 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="swift-recon-cron"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756962 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27683db-592f-485a-93b3-93273e1644c3" containerName="barbican-worker-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756968 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27683db-592f-485a-93b3-93273e1644c3" containerName="barbican-worker-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.756974 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerName="barbican-keystone-listener-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.756980 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerName="barbican-keystone-listener-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.757460 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-auditor"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757477 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-auditor"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.757491 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="openstack-network-exporter"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757497 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="openstack-network-exporter"
Mar 07 07:17:40 crc kubenswrapper[4941]: E0307 07:17:40.757507 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27683db-592f-485a-93b3-93273e1644c3" containerName="barbican-worker"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757513 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27683db-592f-485a-93b3-93273e1644c3" containerName="barbican-worker"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757746 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-expirer"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757756 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757763 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="943d63f3-758d-4884-8086-93defd44f58a" containerName="nova-cell1-conductor-conductor"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757771 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="rsync"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757779 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1fb4667-396e-44bb-a2ed-e576a9b69be2" containerName="galera"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757788 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="openstack-network-exporter"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757795 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-replicator"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757806 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b1bfcb-ef37-4308-b5ef-13ff90b07d4b" containerName="keystone-api"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757817 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerName="placement-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757825 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-auditor"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757832 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-reaper"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757842 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ca3e53-9ebc-464c-ac37-51163b9bc104" containerName="mariadb-account-create-update"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757852 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerName="barbican-keystone-listener-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757859 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerName="proxy-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757870 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27683db-592f-485a-93b3-93273e1644c3" containerName="barbican-worker-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757881 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-updater"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757891 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ca3e53-9ebc-464c-ac37-51163b9bc104" containerName="mariadb-account-create-update"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757900 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="3963d293-d9e9-44b6-b0a5-b1532b4a0a31" containerName="rabbitmq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757909 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb1dd04-5b8c-49b4-bf65-be38fb8ae670" containerName="rabbitmq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757918 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerName="glance-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757926 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-metadata"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757934 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="proxy-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757942 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d72c12-422e-48fd-b56b-8344260e3e01" containerName="glance-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757949 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757955 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757964 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da1ac3d-bac5-4c8e-a920-6b6dff25fd20" containerName="nova-cell0-conductor-conductor"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757973 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="757b037d-b7b8-4690-93b9-ec85c5bf82db" containerName="barbican-api"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757982 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="ceilometer-notification-agent"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757989 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b22d449-d1ec-4bf4-a876-b86a87508580" containerName="memcached"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.757995 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ad12db-0b25-4e03-8772-de047be41b0d" containerName="ovn-northd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758004 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-auditor"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758010 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="swift-recon-cron"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758020 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="ceilometer-central-agent"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758030 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5e0ad9-b4e5-4307-a381-3a92092a3240" containerName="sg-core"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758038 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-updater"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758047 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerName="glance-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758055 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-httpd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758064 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="927b9eb0-124f-4a2c-86ae-2ea4cbe609e7" containerName="glance-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758074 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-replicator"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758082 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="753c78f9-47e6-4098-91fa-9adac0997ba4" containerName="cinder-api-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758088 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-api"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758098 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c4b049-d672-41e8-b3cb-09b800f04a19" containerName="kube-state-metrics"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758107 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b4aa9b-9dc9-4e99-87e8-6320c5a81456" containerName="nova-api-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758117 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4583b7-29d7-466d-8c3d-ad9981ebc66d" containerName="barbican-keystone-listener"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758124 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-replicator"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758130 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="account-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758139 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="object-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758145 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovsdb-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758154 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27683db-592f-485a-93b3-93273e1644c3" containerName="barbican-worker"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758163 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b306e38-c479-45ff-93ab-ca0e0e6a3aef" containerName="galera"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758173 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" containerName="ovs-vswitchd"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758179 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" containerName="container-auditor"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758186 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c892cbf7-126c-4638-854d-18cef63c7747" containerName="nova-metadata-log"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758193 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2f75a4-a46a-4430-bf4d-d3c2c65d8510" containerName="proxy-server"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758199 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2b6a75-839f-4fec-9f12-fb520b44c7ce" containerName="placement-api"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.758208 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3cb3645-4e27-450f-a712-f656dfa9e8e1" containerName="neutron-api"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.759237 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.759745 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bk5cq"]
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.762576 4941 scope.go:117] "RemoveContainer" containerID="94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.767280 4941 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-lock\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.767311 4941 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6a5f223a-7907-42a5-954b-fafc3c4b78da-cache\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.767325 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4nrl\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-kube-api-access-z4nrl\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.767340 4941 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6a5f223a-7907-42a5-954b-fafc3c4b78da-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.767367 4941 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.784683 4941 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.795575 4941 scope.go:117] "RemoveContainer" containerID="3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.814011 4941 scope.go:117] "RemoveContainer" containerID="76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.838142 4941 scope.go:117] "RemoveContainer" containerID="b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.860588 4941 scope.go:117] "RemoveContainer" containerID="dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.868389 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-catalog-content\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.868448 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-utilities\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.868503 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl8j5\" (UniqueName: \"kubernetes.io/projected/e1af0209-ed35-4a7e-9809-5bb513a32d8b-kube-api-access-nl8j5\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.868589 4941 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.881523 4941 scope.go:117] "RemoveContainer" containerID="b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.968592 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a5f223a-7907-42a5-954b-fafc3c4b78da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a5f223a-7907-42a5-954b-fafc3c4b78da" (UID: "6a5f223a-7907-42a5-954b-fafc3c4b78da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.969258 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-catalog-content\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.969294 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-utilities\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.969328 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl8j5\" (UniqueName: \"kubernetes.io/projected/e1af0209-ed35-4a7e-9809-5bb513a32d8b-kube-api-access-nl8j5\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.969391 4941 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a5f223a-7907-42a5-954b-fafc3c4b78da-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.970045 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-catalog-content\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.970259 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-utilities\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:40 crc kubenswrapper[4941]: I0307 07:17:40.995458 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl8j5\" (UniqueName: \"kubernetes.io/projected/e1af0209-ed35-4a7e-9809-5bb513a32d8b-kube-api-access-nl8j5\") pod \"community-operators-bk5cq\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.026452 4941 scope.go:117] "RemoveContainer" containerID="c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.043070 4941 scope.go:117] "RemoveContainer" containerID="39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.067052 4941 scope.go:117] "RemoveContainer" containerID="b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.085949 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bk5cq"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.086164 4941 scope.go:117] "RemoveContainer" containerID="de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.087934 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed\": container with ID starting with de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed not found: ID does not exist" containerID="de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.088011 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed"} err="failed to get container status \"de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed\": rpc error: code = NotFound desc = could not find container \"de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed\": container with ID starting with de3ca98cd25b474d3e353f72d65beb19ff275720c6069dd01c5f7583298062ed not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.088044 4941 scope.go:117] "RemoveContainer" containerID="85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.088434 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6\": container with ID starting with 85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6 not found: ID does not exist" containerID="85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.088465 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6"} err="failed to get container status \"85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6\": rpc error: code = NotFound desc = could not find container \"85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6\": container with ID starting with 85270965dc437741cea69bd912b61c8d09026d3d258607bf435fad06544d3fd6 not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.088483 4941 scope.go:117] "RemoveContainer" containerID="855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.088781 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52\": container with ID starting with 855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52 not found: ID does not exist" containerID="855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.088812 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52"} err="failed to get container status \"855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52\": rpc error: code = NotFound desc = could not find container \"855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52\": container with ID starting with 855cf93f381f91e003458131c37485c03dad7138c9edead25076ccf72ecf7f52 not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.088832 4941 scope.go:117] "RemoveContainer" containerID="56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.089228 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf\": container with ID starting with 56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf not found: ID does not exist" containerID="56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.089275 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf"} err="failed to get container status \"56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf\": rpc error: code = NotFound desc = could not find container \"56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf\": container with ID starting with 56cd38a5b827100634031f80ea3f6b66bba0e5e1443ae05edcfcbe4fc643efaf not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.089309 4941 scope.go:117] "RemoveContainer" containerID="2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.089647 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11\": container with ID starting with 2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11 not found: ID does not exist" containerID="2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.089674 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11"} err="failed to get container status \"2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11\": rpc error: code = NotFound desc = could not find container \"2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11\": container with ID starting with 2e027b31903a84225494031f036fcf7247ad6301c5f5dde58c4ee1c14cce7c11 not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.089690 4941 scope.go:117] "RemoveContainer" containerID="abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.089951 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7\": container with ID starting with abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7 not found: ID does not exist" containerID="abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.089981 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7"} err="failed to get container status \"abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7\": rpc error: code = NotFound desc = could not find container \"abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7\": container with ID starting with abc84bd5c148347fd3e6e9bae1b3e5e71c3cfbd0e165bf4c6c5476ff169250b7 not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.089998 4941 scope.go:117] "RemoveContainer" containerID="94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.090251 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384\": container with ID starting with 94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384 not found: ID does not exist" containerID="94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.090275 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384"} err="failed to get container status \"94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384\": rpc error: code = NotFound desc = could not find container \"94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384\": container with ID starting with 94f7fa69e0150f7b0164f8a024d5ff0ff408147eb7732aa59d194581d6174384 not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.090291 4941 scope.go:117] "RemoveContainer" containerID="3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.090594 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010\": container with ID starting with 3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010 not found: ID does not exist" containerID="3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.090618 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010"} err="failed to get container status \"3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010\": rpc error: code = NotFound desc = could not find container \"3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010\": container with ID starting with 3a29871d30c27b2ebf022f9baa1deedf75b3ae8ee4831d1770e23d3e54a09010 not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.090634 4941 scope.go:117] "RemoveContainer" containerID="76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.090868 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8\": container with ID starting with 76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8 not found: ID does not exist" containerID="76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.090891 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8"} err="failed to get container status \"76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8\": rpc error: code = NotFound desc = could not find container \"76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8\": container with ID starting with 76a0c0f3d60ae71295ad9a44c1ccf6e3b855bfb182c9bd98972a884b1b52f6f8 not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.090911 4941 scope.go:117] "RemoveContainer" containerID="b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.091149 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c\": container with ID starting with b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c not found: ID does not exist" containerID="b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.091171 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c"} err="failed to get container status \"b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c\": rpc error: code = NotFound desc = could not find container \"b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c\": container with ID starting with b2118b6239f5287819f8bf36551a0f5192fed176c1776aa201c973673b7cea6c not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.091188 4941 scope.go:117] "RemoveContainer" containerID="dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef"
Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.091446 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef\": container with ID starting with dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef not found: ID does not exist" containerID="dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.091473 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef"} err="failed to get container status \"dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef\": rpc error: code = NotFound desc = could not find container \"dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef\": container with ID starting with dde308e8b7e70259a3c26190f8e91532136f44b834cef40a52f1a0a3750a50ef not found: ID does not exist"
Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.091493 4941 scope.go:117] "RemoveContainer"
containerID="b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65" Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.091784 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65\": container with ID starting with b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65 not found: ID does not exist" containerID="b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.091828 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65"} err="failed to get container status \"b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65\": rpc error: code = NotFound desc = could not find container \"b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65\": container with ID starting with b6365958c0e82455e8fe1908fb727efdae94a402d88da1bf60bf89283c9d3a65 not found: ID does not exist" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.091857 4941 scope.go:117] "RemoveContainer" containerID="c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57" Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.092195 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57\": container with ID starting with c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57 not found: ID does not exist" containerID="c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.092223 4941 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57"} err="failed to get container status \"c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57\": rpc error: code = NotFound desc = could not find container \"c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57\": container with ID starting with c71a131c16d858150432793989349614354710b468996aed0a90a0a3b4655d57 not found: ID does not exist" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.092245 4941 scope.go:117] "RemoveContainer" containerID="39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c" Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.092580 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c\": container with ID starting with 39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c not found: ID does not exist" containerID="39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.092611 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c"} err="failed to get container status \"39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c\": rpc error: code = NotFound desc = could not find container \"39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c\": container with ID starting with 39628dd870570071b2bd7ae179bda95dfcba64ea2358713fb6ac97b67dd7f09c not found: ID does not exist" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.092632 4941 scope.go:117] "RemoveContainer" containerID="b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89" Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.092966 4941 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89\": container with ID starting with b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89 not found: ID does not exist" containerID="b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.092994 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89"} err="failed to get container status \"b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89\": rpc error: code = NotFound desc = could not find container \"b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89\": container with ID starting with b9f094df208cb97311346655039b847b135f62d9985a004099e08baeed2fee89 not found: ID does not exist" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.093011 4941 scope.go:117] "RemoveContainer" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.150126 4941 scope.go:117] "RemoveContainer" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.197624 4941 scope.go:117] "RemoveContainer" containerID="d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.227546 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.234110 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.244810 4941 scope.go:117] "RemoveContainer" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" Mar 07 07:17:41 crc kubenswrapper[4941]: 
E0307 07:17:41.245286 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e\": container with ID starting with bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e not found: ID does not exist" containerID="bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.245424 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e"} err="failed to get container status \"bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e\": rpc error: code = NotFound desc = could not find container \"bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e\": container with ID starting with bc2ff2502680dcf3bd500138193f1a59a97699ab72d034d5e4131bc1d209b13e not found: ID does not exist" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.245583 4941 scope.go:117] "RemoveContainer" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.248722 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd\": container with ID starting with cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd not found: ID does not exist" containerID="cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.248783 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd"} err="failed to get container status \"cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd\": 
rpc error: code = NotFound desc = could not find container \"cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd\": container with ID starting with cb0177293383dd13d3543c64d5baf16c27b29305dec6d27c000bb7e0a6f88cdd not found: ID does not exist" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.248822 4941 scope.go:117] "RemoveContainer" containerID="d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912" Mar 07 07:17:41 crc kubenswrapper[4941]: E0307 07:17:41.249900 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912\": container with ID starting with d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912 not found: ID does not exist" containerID="d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.250021 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912"} err="failed to get container status \"d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912\": rpc error: code = NotFound desc = could not find container \"d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912\": container with ID starting with d7cbec1f305f6ebf20138346c84c10958909e785b858213cbc4a4dfa6f01f912 not found: ID does not exist" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.647723 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bk5cq"] Mar 07 07:17:41 crc kubenswrapper[4941]: W0307 07:17:41.649779 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1af0209_ed35_4a7e_9809_5bb513a32d8b.slice/crio-39cf9160e025ac2a2888808905b391210100b6f527b0442224544c8cec3b98f1 WatchSource:0}: Error 
finding container 39cf9160e025ac2a2888808905b391210100b6f527b0442224544c8cec3b98f1: Status 404 returned error can't find the container with id 39cf9160e025ac2a2888808905b391210100b6f527b0442224544c8cec3b98f1 Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.966590 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="531af2a1-d934-48a5-b3de-61d475bf252f" path="/var/lib/kubelet/pods/531af2a1-d934-48a5-b3de-61d475bf252f/volumes" Mar 07 07:17:41 crc kubenswrapper[4941]: I0307 07:17:41.967719 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a5f223a-7907-42a5-954b-fafc3c4b78da" path="/var/lib/kubelet/pods/6a5f223a-7907-42a5-954b-fafc3c4b78da/volumes" Mar 07 07:17:42 crc kubenswrapper[4941]: I0307 07:17:42.607822 4941 generic.go:334] "Generic (PLEG): container finished" podID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerID="3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20" exitCode=0 Mar 07 07:17:42 crc kubenswrapper[4941]: I0307 07:17:42.607880 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bk5cq" event={"ID":"e1af0209-ed35-4a7e-9809-5bb513a32d8b","Type":"ContainerDied","Data":"3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20"} Mar 07 07:17:42 crc kubenswrapper[4941]: I0307 07:17:42.607915 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bk5cq" event={"ID":"e1af0209-ed35-4a7e-9809-5bb513a32d8b","Type":"ContainerStarted","Data":"39cf9160e025ac2a2888808905b391210100b6f527b0442224544c8cec3b98f1"} Mar 07 07:17:42 crc kubenswrapper[4941]: I0307 07:17:42.611319 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:17:43 crc kubenswrapper[4941]: I0307 07:17:43.323586 4941 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod317acc48-d39a-4c99-8a4e-ef91b0fc3894"] err="unable 
to destroy cgroup paths for cgroup [kubepods besteffort pod317acc48-d39a-4c99-8a4e-ef91b0fc3894] : Timed out while waiting for systemd to remove kubepods-besteffort-pod317acc48_d39a_4c99_8a4e_ef91b0fc3894.slice" Mar 07 07:17:43 crc kubenswrapper[4941]: I0307 07:17:43.619901 4941 generic.go:334] "Generic (PLEG): container finished" podID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerID="3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458" exitCode=0 Mar 07 07:17:43 crc kubenswrapper[4941]: I0307 07:17:43.619959 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bk5cq" event={"ID":"e1af0209-ed35-4a7e-9809-5bb513a32d8b","Type":"ContainerDied","Data":"3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458"} Mar 07 07:17:44 crc kubenswrapper[4941]: I0307 07:17:44.636474 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bk5cq" event={"ID":"e1af0209-ed35-4a7e-9809-5bb513a32d8b","Type":"ContainerStarted","Data":"14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580"} Mar 07 07:17:44 crc kubenswrapper[4941]: I0307 07:17:44.659235 4941 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod53e374be-8342-42ac-a82a-75854d38e098"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod53e374be-8342-42ac-a82a-75854d38e098] : Timed out while waiting for systemd to remove kubepods-besteffort-pod53e374be_8342_42ac_a82a_75854d38e098.slice" Mar 07 07:17:44 crc kubenswrapper[4941]: E0307 07:17:44.659284 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod53e374be-8342-42ac-a82a-75854d38e098] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod53e374be-8342-42ac-a82a-75854d38e098] : Timed out while waiting for systemd to remove kubepods-besteffort-pod53e374be_8342_42ac_a82a_75854d38e098.slice" 
pod="openstack/nova-cell1-5379-account-create-update-6728g" podUID="53e374be-8342-42ac-a82a-75854d38e098" Mar 07 07:17:44 crc kubenswrapper[4941]: I0307 07:17:44.668121 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bk5cq" podStartSLOduration=3.224459405 podStartE2EDuration="4.668096981s" podCreationTimestamp="2026-03-07 07:17:40 +0000 UTC" firstStartedPulling="2026-03-07 07:17:42.611025805 +0000 UTC m=+1559.563391270" lastFinishedPulling="2026-03-07 07:17:44.054663361 +0000 UTC m=+1561.007028846" observedRunningTime="2026-03-07 07:17:44.653714346 +0000 UTC m=+1561.606079811" watchObservedRunningTime="2026-03-07 07:17:44.668096981 +0000 UTC m=+1561.620462446" Mar 07 07:17:44 crc kubenswrapper[4941]: I0307 07:17:44.680753 4941 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod05a91fa3-14f1-4d15-bdfc-bb1fc310a913"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod05a91fa3-14f1-4d15-bdfc-bb1fc310a913] : Timed out while waiting for systemd to remove kubepods-besteffort-pod05a91fa3_14f1_4d15_bdfc_bb1fc310a913.slice" Mar 07 07:17:44 crc kubenswrapper[4941]: E0307 07:17:44.680804 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod05a91fa3-14f1-4d15-bdfc-bb1fc310a913] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod05a91fa3-14f1-4d15-bdfc-bb1fc310a913] : Timed out while waiting for systemd to remove kubepods-besteffort-pod05a91fa3_14f1_4d15_bdfc_bb1fc310a913.slice" pod="openstack/nova-scheduler-0" podUID="05a91fa3-14f1-4d15-bdfc-bb1fc310a913" Mar 07 07:17:45 crc kubenswrapper[4941]: I0307 07:17:45.643980 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 07:17:45 crc kubenswrapper[4941]: I0307 07:17:45.644047 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5379-account-create-update-6728g" Mar 07 07:17:45 crc kubenswrapper[4941]: I0307 07:17:45.663697 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:17:45 crc kubenswrapper[4941]: I0307 07:17:45.668815 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 07:17:45 crc kubenswrapper[4941]: I0307 07:17:45.690744 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-6728g"] Mar 07 07:17:45 crc kubenswrapper[4941]: I0307 07:17:45.696477 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5379-account-create-update-6728g"] Mar 07 07:17:45 crc kubenswrapper[4941]: I0307 07:17:45.964167 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a91fa3-14f1-4d15-bdfc-bb1fc310a913" path="/var/lib/kubelet/pods/05a91fa3-14f1-4d15-bdfc-bb1fc310a913/volumes" Mar 07 07:17:45 crc kubenswrapper[4941]: I0307 07:17:45.964956 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e374be-8342-42ac-a82a-75854d38e098" path="/var/lib/kubelet/pods/53e374be-8342-42ac-a82a-75854d38e098/volumes" Mar 07 07:17:51 crc kubenswrapper[4941]: I0307 07:17:51.086009 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bk5cq" Mar 07 07:17:51 crc kubenswrapper[4941]: I0307 07:17:51.086321 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bk5cq" Mar 07 07:17:51 crc kubenswrapper[4941]: I0307 07:17:51.136425 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bk5cq" Mar 07 07:17:51 crc kubenswrapper[4941]: I0307 07:17:51.778130 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-bk5cq" Mar 07 07:17:51 crc kubenswrapper[4941]: I0307 07:17:51.848660 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bk5cq"] Mar 07 07:17:53 crc kubenswrapper[4941]: I0307 07:17:53.732137 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bk5cq" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerName="registry-server" containerID="cri-o://14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580" gracePeriod=2 Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.130677 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bk5cq" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.279880 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-catalog-content\") pod \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.279938 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl8j5\" (UniqueName: \"kubernetes.io/projected/e1af0209-ed35-4a7e-9809-5bb513a32d8b-kube-api-access-nl8j5\") pod \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.280051 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-utilities\") pod \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\" (UID: \"e1af0209-ed35-4a7e-9809-5bb513a32d8b\") " Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.281121 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-utilities" (OuterVolumeSpecName: "utilities") pod "e1af0209-ed35-4a7e-9809-5bb513a32d8b" (UID: "e1af0209-ed35-4a7e-9809-5bb513a32d8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.285157 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1af0209-ed35-4a7e-9809-5bb513a32d8b-kube-api-access-nl8j5" (OuterVolumeSpecName: "kube-api-access-nl8j5") pod "e1af0209-ed35-4a7e-9809-5bb513a32d8b" (UID: "e1af0209-ed35-4a7e-9809-5bb513a32d8b"). InnerVolumeSpecName "kube-api-access-nl8j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.347899 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1af0209-ed35-4a7e-9809-5bb513a32d8b" (UID: "e1af0209-ed35-4a7e-9809-5bb513a32d8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.381533 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.381561 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1af0209-ed35-4a7e-9809-5bb513a32d8b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.381572 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl8j5\" (UniqueName: \"kubernetes.io/projected/e1af0209-ed35-4a7e-9809-5bb513a32d8b-kube-api-access-nl8j5\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.747520 4941 generic.go:334] "Generic (PLEG): container finished" podID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerID="14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580" exitCode=0 Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.747580 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bk5cq" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.747581 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bk5cq" event={"ID":"e1af0209-ed35-4a7e-9809-5bb513a32d8b","Type":"ContainerDied","Data":"14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580"} Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.747646 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bk5cq" event={"ID":"e1af0209-ed35-4a7e-9809-5bb513a32d8b","Type":"ContainerDied","Data":"39cf9160e025ac2a2888808905b391210100b6f527b0442224544c8cec3b98f1"} Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.747675 4941 scope.go:117] "RemoveContainer" containerID="14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.781329 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bk5cq"] Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.786948 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bk5cq"] Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.793351 4941 scope.go:117] "RemoveContainer" containerID="3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.822188 4941 scope.go:117] "RemoveContainer" containerID="3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.847142 4941 scope.go:117] "RemoveContainer" containerID="14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580" Mar 07 07:17:54 crc kubenswrapper[4941]: E0307 07:17:54.847676 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580\": container with ID starting with 14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580 not found: ID does not exist" containerID="14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.847750 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580"} err="failed to get container status \"14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580\": rpc error: code = NotFound desc = could not find container \"14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580\": container with ID starting with 14ec74b18d3e1130ac3fa6c326b21b7334b6cacebdaa52f49f7c362250599580 not found: ID does not exist" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.847799 4941 scope.go:117] "RemoveContainer" containerID="3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458" Mar 07 07:17:54 crc kubenswrapper[4941]: E0307 07:17:54.848278 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458\": container with ID starting with 3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458 not found: ID does not exist" containerID="3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.848333 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458"} err="failed to get container status \"3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458\": rpc error: code = NotFound desc = could not find container \"3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458\": container with ID 
starting with 3dfbdcf1f86265bef65f1c80d11d2ad41d80c260783f31a184f9cba7ab14b458 not found: ID does not exist" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.848370 4941 scope.go:117] "RemoveContainer" containerID="3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20" Mar 07 07:17:54 crc kubenswrapper[4941]: E0307 07:17:54.848977 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20\": container with ID starting with 3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20 not found: ID does not exist" containerID="3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20" Mar 07 07:17:54 crc kubenswrapper[4941]: I0307 07:17:54.849039 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20"} err="failed to get container status \"3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20\": rpc error: code = NotFound desc = could not find container \"3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20\": container with ID starting with 3303201d34bd28074f15060600a06105dd842a07142c95c5d6805700d5593f20 not found: ID does not exist" Mar 07 07:17:55 crc kubenswrapper[4941]: I0307 07:17:55.962087 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" path="/var/lib/kubelet/pods/e1af0209-ed35-4a7e-9809-5bb513a32d8b/volumes" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.139388 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547798-dmcvg"] Mar 07 07:18:00 crc kubenswrapper[4941]: E0307 07:18:00.139848 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerName="registry-server" Mar 07 07:18:00 crc 
kubenswrapper[4941]: I0307 07:18:00.139859 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerName="registry-server" Mar 07 07:18:00 crc kubenswrapper[4941]: E0307 07:18:00.139873 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerName="extract-content" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.139880 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerName="extract-content" Mar 07 07:18:00 crc kubenswrapper[4941]: E0307 07:18:00.139894 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerName="extract-utilities" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.139900 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerName="extract-utilities" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.140018 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1af0209-ed35-4a7e-9809-5bb513a32d8b" containerName="registry-server" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.140414 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547798-dmcvg" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.142982 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.143173 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.143232 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.149069 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547798-dmcvg"] Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.266166 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9lkb\" (UniqueName: \"kubernetes.io/projected/74764658-d199-4d6b-8d0f-b04f0839500c-kube-api-access-j9lkb\") pod \"auto-csr-approver-29547798-dmcvg\" (UID: \"74764658-d199-4d6b-8d0f-b04f0839500c\") " pod="openshift-infra/auto-csr-approver-29547798-dmcvg" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.368063 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9lkb\" (UniqueName: \"kubernetes.io/projected/74764658-d199-4d6b-8d0f-b04f0839500c-kube-api-access-j9lkb\") pod \"auto-csr-approver-29547798-dmcvg\" (UID: \"74764658-d199-4d6b-8d0f-b04f0839500c\") " pod="openshift-infra/auto-csr-approver-29547798-dmcvg" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.394894 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9lkb\" (UniqueName: \"kubernetes.io/projected/74764658-d199-4d6b-8d0f-b04f0839500c-kube-api-access-j9lkb\") pod \"auto-csr-approver-29547798-dmcvg\" (UID: \"74764658-d199-4d6b-8d0f-b04f0839500c\") " 
pod="openshift-infra/auto-csr-approver-29547798-dmcvg" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.455992 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547798-dmcvg" Mar 07 07:18:00 crc kubenswrapper[4941]: I0307 07:18:00.927939 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547798-dmcvg"] Mar 07 07:18:00 crc kubenswrapper[4941]: W0307 07:18:00.932713 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74764658_d199_4d6b_8d0f_b04f0839500c.slice/crio-2beb99d874cf80748ed6c741c6f0522706ab758ba3b9fd3d0bce7d530a6ef350 WatchSource:0}: Error finding container 2beb99d874cf80748ed6c741c6f0522706ab758ba3b9fd3d0bce7d530a6ef350: Status 404 returned error can't find the container with id 2beb99d874cf80748ed6c741c6f0522706ab758ba3b9fd3d0bce7d530a6ef350 Mar 07 07:18:01 crc kubenswrapper[4941]: I0307 07:18:01.815151 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547798-dmcvg" event={"ID":"74764658-d199-4d6b-8d0f-b04f0839500c","Type":"ContainerStarted","Data":"2beb99d874cf80748ed6c741c6f0522706ab758ba3b9fd3d0bce7d530a6ef350"} Mar 07 07:18:02 crc kubenswrapper[4941]: I0307 07:18:02.828114 4941 generic.go:334] "Generic (PLEG): container finished" podID="74764658-d199-4d6b-8d0f-b04f0839500c" containerID="874045616cd659190821ba7f859a7926099cc05a1035859abdb89bcd9b16bddb" exitCode=0 Mar 07 07:18:02 crc kubenswrapper[4941]: I0307 07:18:02.828199 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547798-dmcvg" event={"ID":"74764658-d199-4d6b-8d0f-b04f0839500c","Type":"ContainerDied","Data":"874045616cd659190821ba7f859a7926099cc05a1035859abdb89bcd9b16bddb"} Mar 07 07:18:04 crc kubenswrapper[4941]: I0307 07:18:04.220998 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547798-dmcvg" Mar 07 07:18:04 crc kubenswrapper[4941]: I0307 07:18:04.325057 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9lkb\" (UniqueName: \"kubernetes.io/projected/74764658-d199-4d6b-8d0f-b04f0839500c-kube-api-access-j9lkb\") pod \"74764658-d199-4d6b-8d0f-b04f0839500c\" (UID: \"74764658-d199-4d6b-8d0f-b04f0839500c\") " Mar 07 07:18:04 crc kubenswrapper[4941]: I0307 07:18:04.331642 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74764658-d199-4d6b-8d0f-b04f0839500c-kube-api-access-j9lkb" (OuterVolumeSpecName: "kube-api-access-j9lkb") pod "74764658-d199-4d6b-8d0f-b04f0839500c" (UID: "74764658-d199-4d6b-8d0f-b04f0839500c"). InnerVolumeSpecName "kube-api-access-j9lkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:18:04 crc kubenswrapper[4941]: I0307 07:18:04.427466 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9lkb\" (UniqueName: \"kubernetes.io/projected/74764658-d199-4d6b-8d0f-b04f0839500c-kube-api-access-j9lkb\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:04 crc kubenswrapper[4941]: I0307 07:18:04.848462 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547798-dmcvg" event={"ID":"74764658-d199-4d6b-8d0f-b04f0839500c","Type":"ContainerDied","Data":"2beb99d874cf80748ed6c741c6f0522706ab758ba3b9fd3d0bce7d530a6ef350"} Mar 07 07:18:04 crc kubenswrapper[4941]: I0307 07:18:04.848502 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2beb99d874cf80748ed6c741c6f0522706ab758ba3b9fd3d0bce7d530a6ef350" Mar 07 07:18:04 crc kubenswrapper[4941]: I0307 07:18:04.848529 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547798-dmcvg" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.200224 4941 scope.go:117] "RemoveContainer" containerID="f29e9223a03236b2dc5bc79ce9ec2bcfcd1509d865de72b65c0643b5a816cb06" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.246704 4941 scope.go:117] "RemoveContainer" containerID="9f3a6b72f7f858b3900528d9ee4a3d8a16ca3a77b9b2764fdcf03670e1821e59" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.282994 4941 scope.go:117] "RemoveContainer" containerID="f9de7cd1754d7c2a737281737d75a1949c7b38949c7e6b30706e3ab775e70345" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.317496 4941 scope.go:117] "RemoveContainer" containerID="8fc805bc99c0c8af89d3b1cb58369ff1e706429e415c177a6ab724b2d108401f" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.327255 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547792-rq6d7"] Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.335220 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547792-rq6d7"] Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.360005 4941 scope.go:117] "RemoveContainer" containerID="11047d81854f9312c66dbdfc8c3e2ef41e2db704c8c78fd59294ed3fb616fe1c" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.387107 4941 scope.go:117] "RemoveContainer" containerID="91f78e3e58dc63c31d24b89ad1468484bedb657015f4630503f7c2e482731d6d" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.403369 4941 scope.go:117] "RemoveContainer" containerID="f261bcf1dd456507063c04b9d5073baa9a160e87a09f3c50d3c9076190c31770" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.418695 4941 scope.go:117] "RemoveContainer" containerID="56166620984b0f0801be41176baf5cfe9ea22ccae5626354a2b64a1dbe53dc43" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.435823 4941 scope.go:117] "RemoveContainer" 
containerID="2585f0f88683c18d6231df3938f2dc939959c3827623adfcebb1e2c4de47e762" Mar 07 07:18:05 crc kubenswrapper[4941]: I0307 07:18:05.962266 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03901d36-4348-43da-ad11-7592d9dd31e6" path="/var/lib/kubelet/pods/03901d36-4348-43da-ad11-7592d9dd31e6/volumes" Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.314767 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.315092 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.315151 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.315920 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.316036 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" 
containerName="machine-config-daemon" containerID="cri-o://c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" gracePeriod=600 Mar 07 07:18:10 crc kubenswrapper[4941]: E0307 07:18:10.457840 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.902632 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" exitCode=0 Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.902672 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97"} Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.902720 4941 scope.go:117] "RemoveContainer" containerID="275b7664a9752e3935f384cd42fa92a626ebe7af03267d645869fc5e152276f5" Mar 07 07:18:10 crc kubenswrapper[4941]: I0307 07:18:10.903243 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:18:10 crc kubenswrapper[4941]: E0307 07:18:10.903525 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:18:22 crc kubenswrapper[4941]: I0307 07:18:22.954039 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:18:22 crc kubenswrapper[4941]: E0307 07:18:22.954847 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:18:33 crc kubenswrapper[4941]: I0307 07:18:33.965788 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:18:33 crc kubenswrapper[4941]: E0307 07:18:33.966684 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:18:47 crc kubenswrapper[4941]: I0307 07:18:47.955185 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:18:47 crc kubenswrapper[4941]: E0307 07:18:47.956110 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:18:59 crc kubenswrapper[4941]: I0307 07:18:59.954445 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:18:59 crc kubenswrapper[4941]: E0307 07:18:59.955207 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.154273 4941 scope.go:117] "RemoveContainer" containerID="ed6ec49be7f1a8845083f19ad0eb4fcb07efc1ddd0b25370c3acd52ac6a40adb" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.218298 4941 scope.go:117] "RemoveContainer" containerID="869bc218596392f73c6fe9f035895dd8eea729b54c44ab37db7ecb0acfe66eae" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.254093 4941 scope.go:117] "RemoveContainer" containerID="ba7162468c4c70ab71e9a28be05ef6c3bf697ae5de5b59603a39b7e5b0531b86" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.276030 4941 scope.go:117] "RemoveContainer" containerID="b204fbe9ce1cfc32ef2454ccbe3f384c2a0a936bc042d09ea6997fe637820ed5" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.294520 4941 scope.go:117] "RemoveContainer" containerID="548daa53b60a430ed9122c44f5914f8f99a73a64e60a05630135d619b42368a0" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.315887 4941 scope.go:117] "RemoveContainer" containerID="73cfdb16aec3ceb4e0b85e78966b407c79934267edb526cc91fff8ba1abf81d4" Mar 07 07:19:06 crc 
kubenswrapper[4941]: I0307 07:19:06.347821 4941 scope.go:117] "RemoveContainer" containerID="240c2c14f09122e45dabd994d557fd833ae18604a37a74ced8f0772544c52251" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.370253 4941 scope.go:117] "RemoveContainer" containerID="242786fcb2ee6c81ed2471133631002d89ae5e5600992100b73f86fed708be4c" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.392693 4941 scope.go:117] "RemoveContainer" containerID="c7b5da34db23d169d9edebd23dd84191643d56cd40b78fafb5704b26e70e540c" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.410740 4941 scope.go:117] "RemoveContainer" containerID="db932bd212b934d5d0602e6a0bbf134bb38c6be7b68eb9e1378b335a513b1db6" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.426038 4941 scope.go:117] "RemoveContainer" containerID="771a6acb79466846fbbda26c235f8a9856350aa97ba28e19f837ce95c717bb48" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.456878 4941 scope.go:117] "RemoveContainer" containerID="b92f5de33b659f93ae7984e5c8bec08d59a4376d29eec81d883e98db00bb0c26" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.487918 4941 scope.go:117] "RemoveContainer" containerID="c1f4de81631f63d5d8bbd5141740771337f23ab59c7dbb591d707c033658d5ad" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.521226 4941 scope.go:117] "RemoveContainer" containerID="a81ee4ba569fd2814e50a034f3ae5d58e32374cb4584d0ccc23f214dc6519957" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.561917 4941 scope.go:117] "RemoveContainer" containerID="c24fae2d64b62c86483898cb09d66f911efa88a19a00118d0ad7bfa3f022fa48" Mar 07 07:19:06 crc kubenswrapper[4941]: I0307 07:19:06.601109 4941 scope.go:117] "RemoveContainer" containerID="81b976eb55b844a9710c1ad3933aa492f5bc903c7f6a05eb196b62de0c30c0c5" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.226434 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ndh79"] Mar 07 07:19:10 crc kubenswrapper[4941]: E0307 
07:19:10.226997 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74764658-d199-4d6b-8d0f-b04f0839500c" containerName="oc" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.227010 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="74764658-d199-4d6b-8d0f-b04f0839500c" containerName="oc" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.227141 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="74764658-d199-4d6b-8d0f-b04f0839500c" containerName="oc" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.228054 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.256326 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndh79"] Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.399186 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-utilities\") pod \"redhat-marketplace-ndh79\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.399258 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-catalog-content\") pod \"redhat-marketplace-ndh79\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.399919 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4t6\" (UniqueName: \"kubernetes.io/projected/112875b4-becc-4319-af2c-0a28322e74fe-kube-api-access-jh4t6\") pod 
\"redhat-marketplace-ndh79\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.501334 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh4t6\" (UniqueName: \"kubernetes.io/projected/112875b4-becc-4319-af2c-0a28322e74fe-kube-api-access-jh4t6\") pod \"redhat-marketplace-ndh79\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.501487 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-utilities\") pod \"redhat-marketplace-ndh79\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.501591 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-catalog-content\") pod \"redhat-marketplace-ndh79\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.502137 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-utilities\") pod \"redhat-marketplace-ndh79\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.502197 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-catalog-content\") pod \"redhat-marketplace-ndh79\" (UID: 
\"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.521828 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4t6\" (UniqueName: \"kubernetes.io/projected/112875b4-becc-4319-af2c-0a28322e74fe-kube-api-access-jh4t6\") pod \"redhat-marketplace-ndh79\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.545655 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:10 crc kubenswrapper[4941]: I0307 07:19:10.990100 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndh79"] Mar 07 07:19:11 crc kubenswrapper[4941]: I0307 07:19:11.490373 4941 generic.go:334] "Generic (PLEG): container finished" podID="112875b4-becc-4319-af2c-0a28322e74fe" containerID="58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90" exitCode=0 Mar 07 07:19:11 crc kubenswrapper[4941]: I0307 07:19:11.490473 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndh79" event={"ID":"112875b4-becc-4319-af2c-0a28322e74fe","Type":"ContainerDied","Data":"58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90"} Mar 07 07:19:11 crc kubenswrapper[4941]: I0307 07:19:11.490540 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndh79" event={"ID":"112875b4-becc-4319-af2c-0a28322e74fe","Type":"ContainerStarted","Data":"d01c729f5b38631e8eff7d85719d2f66821e153ac0e8e89573c040994912a394"} Mar 07 07:19:12 crc kubenswrapper[4941]: I0307 07:19:12.503007 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndh79" 
event={"ID":"112875b4-becc-4319-af2c-0a28322e74fe","Type":"ContainerStarted","Data":"ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76"} Mar 07 07:19:13 crc kubenswrapper[4941]: I0307 07:19:13.530585 4941 generic.go:334] "Generic (PLEG): container finished" podID="112875b4-becc-4319-af2c-0a28322e74fe" containerID="ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76" exitCode=0 Mar 07 07:19:13 crc kubenswrapper[4941]: I0307 07:19:13.530688 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndh79" event={"ID":"112875b4-becc-4319-af2c-0a28322e74fe","Type":"ContainerDied","Data":"ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76"} Mar 07 07:19:13 crc kubenswrapper[4941]: I0307 07:19:13.965510 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:19:13 crc kubenswrapper[4941]: E0307 07:19:13.965787 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:19:14 crc kubenswrapper[4941]: I0307 07:19:14.546055 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndh79" event={"ID":"112875b4-becc-4319-af2c-0a28322e74fe","Type":"ContainerStarted","Data":"1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14"} Mar 07 07:19:14 crc kubenswrapper[4941]: I0307 07:19:14.575905 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ndh79" podStartSLOduration=2.096021392 podStartE2EDuration="4.575879621s" 
podCreationTimestamp="2026-03-07 07:19:10 +0000 UTC" firstStartedPulling="2026-03-07 07:19:11.492728148 +0000 UTC m=+1648.445093643" lastFinishedPulling="2026-03-07 07:19:13.972586407 +0000 UTC m=+1650.924951872" observedRunningTime="2026-03-07 07:19:14.570689729 +0000 UTC m=+1651.523055194" watchObservedRunningTime="2026-03-07 07:19:14.575879621 +0000 UTC m=+1651.528245116" Mar 07 07:19:20 crc kubenswrapper[4941]: I0307 07:19:20.546643 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:20 crc kubenswrapper[4941]: I0307 07:19:20.547182 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:20 crc kubenswrapper[4941]: I0307 07:19:20.626500 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:20 crc kubenswrapper[4941]: I0307 07:19:20.685529 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:20 crc kubenswrapper[4941]: I0307 07:19:20.870315 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndh79"] Mar 07 07:19:22 crc kubenswrapper[4941]: I0307 07:19:22.615789 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ndh79" podUID="112875b4-becc-4319-af2c-0a28322e74fe" containerName="registry-server" containerID="cri-o://1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14" gracePeriod=2 Mar 07 07:19:22 crc kubenswrapper[4941]: I0307 07:19:22.955304 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.111379 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-utilities\") pod \"112875b4-becc-4319-af2c-0a28322e74fe\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.111561 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-catalog-content\") pod \"112875b4-becc-4319-af2c-0a28322e74fe\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.111622 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh4t6\" (UniqueName: \"kubernetes.io/projected/112875b4-becc-4319-af2c-0a28322e74fe-kube-api-access-jh4t6\") pod \"112875b4-becc-4319-af2c-0a28322e74fe\" (UID: \"112875b4-becc-4319-af2c-0a28322e74fe\") " Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.112732 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-utilities" (OuterVolumeSpecName: "utilities") pod "112875b4-becc-4319-af2c-0a28322e74fe" (UID: "112875b4-becc-4319-af2c-0a28322e74fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.121535 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112875b4-becc-4319-af2c-0a28322e74fe-kube-api-access-jh4t6" (OuterVolumeSpecName: "kube-api-access-jh4t6") pod "112875b4-becc-4319-af2c-0a28322e74fe" (UID: "112875b4-becc-4319-af2c-0a28322e74fe"). InnerVolumeSpecName "kube-api-access-jh4t6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.214361 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.214438 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh4t6\" (UniqueName: \"kubernetes.io/projected/112875b4-becc-4319-af2c-0a28322e74fe-kube-api-access-jh4t6\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.253664 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "112875b4-becc-4319-af2c-0a28322e74fe" (UID: "112875b4-becc-4319-af2c-0a28322e74fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.315882 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112875b4-becc-4319-af2c-0a28322e74fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.626997 4941 generic.go:334] "Generic (PLEG): container finished" podID="112875b4-becc-4319-af2c-0a28322e74fe" containerID="1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14" exitCode=0 Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.627049 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ndh79" event={"ID":"112875b4-becc-4319-af2c-0a28322e74fe","Type":"ContainerDied","Data":"1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14"} Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.627081 4941 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-ndh79" event={"ID":"112875b4-becc-4319-af2c-0a28322e74fe","Type":"ContainerDied","Data":"d01c729f5b38631e8eff7d85719d2f66821e153ac0e8e89573c040994912a394"} Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.627100 4941 scope.go:117] "RemoveContainer" containerID="1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.627115 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ndh79" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.670564 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndh79"] Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.676529 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ndh79"] Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.679730 4941 scope.go:117] "RemoveContainer" containerID="ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.707245 4941 scope.go:117] "RemoveContainer" containerID="58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.733502 4941 scope.go:117] "RemoveContainer" containerID="1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14" Mar 07 07:19:23 crc kubenswrapper[4941]: E0307 07:19:23.734161 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14\": container with ID starting with 1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14 not found: ID does not exist" containerID="1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.734211 4941 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14"} err="failed to get container status \"1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14\": rpc error: code = NotFound desc = could not find container \"1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14\": container with ID starting with 1a5a174ce2e4d9099f6a92502d3fefe22f42da1046c0ed84eb2168138acd6a14 not found: ID does not exist" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.734240 4941 scope.go:117] "RemoveContainer" containerID="ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76" Mar 07 07:19:23 crc kubenswrapper[4941]: E0307 07:19:23.734928 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76\": container with ID starting with ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76 not found: ID does not exist" containerID="ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.735024 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76"} err="failed to get container status \"ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76\": rpc error: code = NotFound desc = could not find container \"ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76\": container with ID starting with ec2ae9949871d9ef6c25b290e701185b34c46b0ca2f35cc57b9641fa1eec1d76 not found: ID does not exist" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.735096 4941 scope.go:117] "RemoveContainer" containerID="58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90" Mar 07 07:19:23 crc kubenswrapper[4941]: E0307 
07:19:23.735707 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90\": container with ID starting with 58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90 not found: ID does not exist" containerID="58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.735752 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90"} err="failed to get container status \"58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90\": rpc error: code = NotFound desc = could not find container \"58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90\": container with ID starting with 58bf82e3b8243b66ba304be6edbb3b8905248fa2be51b27e46f3f150cfb14b90 not found: ID does not exist" Mar 07 07:19:23 crc kubenswrapper[4941]: I0307 07:19:23.973377 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="112875b4-becc-4319-af2c-0a28322e74fe" path="/var/lib/kubelet/pods/112875b4-becc-4319-af2c-0a28322e74fe/volumes" Mar 07 07:19:24 crc kubenswrapper[4941]: I0307 07:19:24.954311 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:19:24 crc kubenswrapper[4941]: E0307 07:19:24.954585 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:19:35 crc kubenswrapper[4941]: I0307 07:19:35.955587 
4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:19:35 crc kubenswrapper[4941]: E0307 07:19:35.956721 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:19:48 crc kubenswrapper[4941]: I0307 07:19:48.954753 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:19:48 crc kubenswrapper[4941]: E0307 07:19:48.955728 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:19:59 crc kubenswrapper[4941]: I0307 07:19:59.956171 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:19:59 crc kubenswrapper[4941]: E0307 07:19:59.957227 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 
07:20:00.162487 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547800-gkbnj"] Mar 07 07:20:00 crc kubenswrapper[4941]: E0307 07:20:00.162843 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112875b4-becc-4319-af2c-0a28322e74fe" containerName="registry-server" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.162865 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="112875b4-becc-4319-af2c-0a28322e74fe" containerName="registry-server" Mar 07 07:20:00 crc kubenswrapper[4941]: E0307 07:20:00.162896 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112875b4-becc-4319-af2c-0a28322e74fe" containerName="extract-content" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.162904 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="112875b4-becc-4319-af2c-0a28322e74fe" containerName="extract-content" Mar 07 07:20:00 crc kubenswrapper[4941]: E0307 07:20:00.162920 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112875b4-becc-4319-af2c-0a28322e74fe" containerName="extract-utilities" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.162928 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="112875b4-becc-4319-af2c-0a28322e74fe" containerName="extract-utilities" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.163112 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="112875b4-becc-4319-af2c-0a28322e74fe" containerName="registry-server" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.163703 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547800-gkbnj" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.168373 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.169085 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.169855 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.185828 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547800-gkbnj"] Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.296132 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79pb4\" (UniqueName: \"kubernetes.io/projected/e29a8223-a33f-47bd-b4a6-6053f8b0af5a-kube-api-access-79pb4\") pod \"auto-csr-approver-29547800-gkbnj\" (UID: \"e29a8223-a33f-47bd-b4a6-6053f8b0af5a\") " pod="openshift-infra/auto-csr-approver-29547800-gkbnj" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.397330 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79pb4\" (UniqueName: \"kubernetes.io/projected/e29a8223-a33f-47bd-b4a6-6053f8b0af5a-kube-api-access-79pb4\") pod \"auto-csr-approver-29547800-gkbnj\" (UID: \"e29a8223-a33f-47bd-b4a6-6053f8b0af5a\") " pod="openshift-infra/auto-csr-approver-29547800-gkbnj" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.418478 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79pb4\" (UniqueName: \"kubernetes.io/projected/e29a8223-a33f-47bd-b4a6-6053f8b0af5a-kube-api-access-79pb4\") pod \"auto-csr-approver-29547800-gkbnj\" (UID: \"e29a8223-a33f-47bd-b4a6-6053f8b0af5a\") " 
pod="openshift-infra/auto-csr-approver-29547800-gkbnj" Mar 07 07:20:00 crc kubenswrapper[4941]: I0307 07:20:00.499250 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547800-gkbnj" Mar 07 07:20:01 crc kubenswrapper[4941]: I0307 07:20:01.031693 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547800-gkbnj"] Mar 07 07:20:02 crc kubenswrapper[4941]: I0307 07:20:02.002583 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547800-gkbnj" event={"ID":"e29a8223-a33f-47bd-b4a6-6053f8b0af5a","Type":"ContainerStarted","Data":"47b874fcb40dec22be771ce0db64a54b34a7b6ab718b9ffd223128061b2b32a5"} Mar 07 07:20:03 crc kubenswrapper[4941]: I0307 07:20:03.016361 4941 generic.go:334] "Generic (PLEG): container finished" podID="e29a8223-a33f-47bd-b4a6-6053f8b0af5a" containerID="9396222f588da070376b56bef1762286810c8fa1663143ad9c303bdfa91f38fc" exitCode=0 Mar 07 07:20:03 crc kubenswrapper[4941]: I0307 07:20:03.016468 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547800-gkbnj" event={"ID":"e29a8223-a33f-47bd-b4a6-6053f8b0af5a","Type":"ContainerDied","Data":"9396222f588da070376b56bef1762286810c8fa1663143ad9c303bdfa91f38fc"} Mar 07 07:20:04 crc kubenswrapper[4941]: I0307 07:20:04.338183 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547800-gkbnj" Mar 07 07:20:04 crc kubenswrapper[4941]: I0307 07:20:04.494478 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79pb4\" (UniqueName: \"kubernetes.io/projected/e29a8223-a33f-47bd-b4a6-6053f8b0af5a-kube-api-access-79pb4\") pod \"e29a8223-a33f-47bd-b4a6-6053f8b0af5a\" (UID: \"e29a8223-a33f-47bd-b4a6-6053f8b0af5a\") " Mar 07 07:20:04 crc kubenswrapper[4941]: I0307 07:20:04.501383 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29a8223-a33f-47bd-b4a6-6053f8b0af5a-kube-api-access-79pb4" (OuterVolumeSpecName: "kube-api-access-79pb4") pod "e29a8223-a33f-47bd-b4a6-6053f8b0af5a" (UID: "e29a8223-a33f-47bd-b4a6-6053f8b0af5a"). InnerVolumeSpecName "kube-api-access-79pb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:04 crc kubenswrapper[4941]: I0307 07:20:04.596635 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79pb4\" (UniqueName: \"kubernetes.io/projected/e29a8223-a33f-47bd-b4a6-6053f8b0af5a-kube-api-access-79pb4\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:05 crc kubenswrapper[4941]: I0307 07:20:05.034000 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547800-gkbnj" event={"ID":"e29a8223-a33f-47bd-b4a6-6053f8b0af5a","Type":"ContainerDied","Data":"47b874fcb40dec22be771ce0db64a54b34a7b6ab718b9ffd223128061b2b32a5"} Mar 07 07:20:05 crc kubenswrapper[4941]: I0307 07:20:05.034099 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b874fcb40dec22be771ce0db64a54b34a7b6ab718b9ffd223128061b2b32a5" Mar 07 07:20:05 crc kubenswrapper[4941]: I0307 07:20:05.034058 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547800-gkbnj" Mar 07 07:20:05 crc kubenswrapper[4941]: I0307 07:20:05.437667 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547794-76jnh"] Mar 07 07:20:05 crc kubenswrapper[4941]: I0307 07:20:05.449043 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547794-76jnh"] Mar 07 07:20:05 crc kubenswrapper[4941]: I0307 07:20:05.966735 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb5331d0-b6ce-4295-8b61-ffb8f5425d7a" path="/var/lib/kubelet/pods/eb5331d0-b6ce-4295-8b61-ffb8f5425d7a/volumes" Mar 07 07:20:06 crc kubenswrapper[4941]: I0307 07:20:06.871272 4941 scope.go:117] "RemoveContainer" containerID="7309868f4caab95c79325c4137c9791aaa3b778c28a0d6e39b6d6ff175e4b90e" Mar 07 07:20:06 crc kubenswrapper[4941]: I0307 07:20:06.891015 4941 scope.go:117] "RemoveContainer" containerID="36e0c01fd1cc1fab82790d367c1f67d709d44426d74406326bf00d6f4c0369ff" Mar 07 07:20:06 crc kubenswrapper[4941]: I0307 07:20:06.913130 4941 scope.go:117] "RemoveContainer" containerID="e479eac33d253a468fe385bde1c5fe732f9ef0a1dd4b506f889b495235a2a7db" Mar 07 07:20:06 crc kubenswrapper[4941]: I0307 07:20:06.962638 4941 scope.go:117] "RemoveContainer" containerID="0125632d69c55a0569c4381d80b85512d72762ed0a3f3134b448fae40348ea6f" Mar 07 07:20:06 crc kubenswrapper[4941]: I0307 07:20:06.997678 4941 scope.go:117] "RemoveContainer" containerID="283944fffcc901ad82bbce7c77945f979ac62b8e6938453b65879e928c81ec34" Mar 07 07:20:07 crc kubenswrapper[4941]: I0307 07:20:07.019198 4941 scope.go:117] "RemoveContainer" containerID="2fe58906a6536fc9344008dd7883fbdd2d04d7c1f5a06fb56a2b123eda1a587c" Mar 07 07:20:07 crc kubenswrapper[4941]: I0307 07:20:07.058382 4941 scope.go:117] "RemoveContainer" containerID="22ffe531e01847af7ad108a5c553f268eb618deeba4f9532ec75229571a17c49" Mar 07 07:20:07 crc kubenswrapper[4941]: I0307 07:20:07.087350 4941 
scope.go:117] "RemoveContainer" containerID="1cc00922fe10127ab4fdd98f27c6d3cc813448e5d7ee0455b1fa2bf47a5b5470" Mar 07 07:20:12 crc kubenswrapper[4941]: I0307 07:20:12.955035 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:20:12 crc kubenswrapper[4941]: E0307 07:20:12.956136 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:20:25 crc kubenswrapper[4941]: I0307 07:20:25.954967 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:20:25 crc kubenswrapper[4941]: E0307 07:20:25.955568 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:20:40 crc kubenswrapper[4941]: I0307 07:20:40.954927 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:20:40 crc kubenswrapper[4941]: E0307 07:20:40.955768 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:20:55 crc kubenswrapper[4941]: I0307 07:20:55.954385 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:20:55 crc kubenswrapper[4941]: E0307 07:20:55.955584 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.239293 4941 scope.go:117] "RemoveContainer" containerID="cee97226fd2fe2196abc3b2a74e837c3f4b06a16e8e4628edcae67b979c11f70" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.273689 4941 scope.go:117] "RemoveContainer" containerID="c31df48ec45a74d7943bce634439f98715ea8900b473d4e84dcbc0cbfcb889ec" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.322310 4941 scope.go:117] "RemoveContainer" containerID="ed43789861becd87eee81a4232f20de6afb6f8198fc9dd762f6924dee8e81bc0" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.362826 4941 scope.go:117] "RemoveContainer" containerID="9f3fb6ba0495b23cf3bd3f4a5cbc0981199748f055976e7d3a133030731c406e" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.383930 4941 scope.go:117] "RemoveContainer" containerID="9b3dcf3e98f5a12c294bf352756527783cac59ff8354db8088af9195cd7d4f5f" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.406846 4941 scope.go:117] "RemoveContainer" containerID="f8f7a812693ec4737e88de8877359cbe2542deb142efe695c7e5c2c6e6b81e86" Mar 07 07:21:07 crc 
kubenswrapper[4941]: I0307 07:21:07.432880 4941 scope.go:117] "RemoveContainer" containerID="5e1a08351cf1bf1c631dfbb909420b694b7c3ab021d1cb299980c16236829080" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.459290 4941 scope.go:117] "RemoveContainer" containerID="163550a6742880ef66e9f9a94958c2627b1e920b68844505b999087c43e31cbf" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.503086 4941 scope.go:117] "RemoveContainer" containerID="00a40b2d4c9f4455f85c402834216df4b345ea50bf85f4dc496c3575f40cc1f6" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.537492 4941 scope.go:117] "RemoveContainer" containerID="7151f2c385ba3425adbc17ab8a8c23b9f8502b8f68d9d7aa6b5c3ae6f0f0ee09" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.554539 4941 scope.go:117] "RemoveContainer" containerID="714d0156b3edde1cf179d33dedfa80bd0b65ef39151a20043484346d70977085" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.575669 4941 scope.go:117] "RemoveContainer" containerID="bbd647431dce8adc95d3e91bdfb368bc3cc17bc947f541ac45609fc6e7c16e1e" Mar 07 07:21:07 crc kubenswrapper[4941]: I0307 07:21:07.605837 4941 scope.go:117] "RemoveContainer" containerID="476fb5c632c42dcf296ba56237913854497fa76de1766c6d57e69b4168f8a311" Mar 07 07:21:10 crc kubenswrapper[4941]: I0307 07:21:10.954966 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:21:10 crc kubenswrapper[4941]: E0307 07:21:10.955607 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:21:21 crc kubenswrapper[4941]: I0307 07:21:21.954165 4941 scope.go:117] 
"RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:21:21 crc kubenswrapper[4941]: E0307 07:21:21.955041 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:21:33 crc kubenswrapper[4941]: I0307 07:21:33.967872 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:21:33 crc kubenswrapper[4941]: E0307 07:21:33.969219 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:21:44 crc kubenswrapper[4941]: I0307 07:21:44.954466 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:21:44 crc kubenswrapper[4941]: E0307 07:21:44.955569 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:21:56 crc kubenswrapper[4941]: I0307 07:21:56.954724 
4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:21:56 crc kubenswrapper[4941]: E0307 07:21:56.955976 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.153582 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547802-tfzjw"] Mar 07 07:22:00 crc kubenswrapper[4941]: E0307 07:22:00.154264 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29a8223-a33f-47bd-b4a6-6053f8b0af5a" containerName="oc" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.154280 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29a8223-a33f-47bd-b4a6-6053f8b0af5a" containerName="oc" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.154485 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29a8223-a33f-47bd-b4a6-6053f8b0af5a" containerName="oc" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.155023 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.157152 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.157675 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.159310 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.172182 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547802-tfzjw"] Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.190131 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvtln\" (UniqueName: \"kubernetes.io/projected/a7fcce16-6076-4b6b-af87-6d693670242c-kube-api-access-gvtln\") pod \"auto-csr-approver-29547802-tfzjw\" (UID: \"a7fcce16-6076-4b6b-af87-6d693670242c\") " pod="openshift-infra/auto-csr-approver-29547802-tfzjw" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.291190 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvtln\" (UniqueName: \"kubernetes.io/projected/a7fcce16-6076-4b6b-af87-6d693670242c-kube-api-access-gvtln\") pod \"auto-csr-approver-29547802-tfzjw\" (UID: \"a7fcce16-6076-4b6b-af87-6d693670242c\") " pod="openshift-infra/auto-csr-approver-29547802-tfzjw" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.312263 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvtln\" (UniqueName: \"kubernetes.io/projected/a7fcce16-6076-4b6b-af87-6d693670242c-kube-api-access-gvtln\") pod \"auto-csr-approver-29547802-tfzjw\" (UID: \"a7fcce16-6076-4b6b-af87-6d693670242c\") " 
pod="openshift-infra/auto-csr-approver-29547802-tfzjw" Mar 07 07:22:00 crc kubenswrapper[4941]: I0307 07:22:00.483073 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" Mar 07 07:22:01 crc kubenswrapper[4941]: I0307 07:22:01.013326 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547802-tfzjw"] Mar 07 07:22:01 crc kubenswrapper[4941]: I0307 07:22:01.189312 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" event={"ID":"a7fcce16-6076-4b6b-af87-6d693670242c","Type":"ContainerStarted","Data":"eaf2e5bcfa7018ba350aaa6d61000e91d67fad5fadc8c3415de93957907891db"} Mar 07 07:22:02 crc kubenswrapper[4941]: I0307 07:22:02.200696 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" event={"ID":"a7fcce16-6076-4b6b-af87-6d693670242c","Type":"ContainerStarted","Data":"04fb31e637ed76e095c99f1a40c4aac6c7bc397114c1ff9051c6fa3b7cb845d3"} Mar 07 07:22:02 crc kubenswrapper[4941]: I0307 07:22:02.218914 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" podStartSLOduration=1.394158675 podStartE2EDuration="2.218895013s" podCreationTimestamp="2026-03-07 07:22:00 +0000 UTC" firstStartedPulling="2026-03-07 07:22:01.01806807 +0000 UTC m=+1817.970433555" lastFinishedPulling="2026-03-07 07:22:01.842804408 +0000 UTC m=+1818.795169893" observedRunningTime="2026-03-07 07:22:02.216886392 +0000 UTC m=+1819.169251897" watchObservedRunningTime="2026-03-07 07:22:02.218895013 +0000 UTC m=+1819.171260478" Mar 07 07:22:03 crc kubenswrapper[4941]: I0307 07:22:03.212113 4941 generic.go:334] "Generic (PLEG): container finished" podID="a7fcce16-6076-4b6b-af87-6d693670242c" containerID="04fb31e637ed76e095c99f1a40c4aac6c7bc397114c1ff9051c6fa3b7cb845d3" exitCode=0 Mar 07 07:22:03 crc 
kubenswrapper[4941]: I0307 07:22:03.212391 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" event={"ID":"a7fcce16-6076-4b6b-af87-6d693670242c","Type":"ContainerDied","Data":"04fb31e637ed76e095c99f1a40c4aac6c7bc397114c1ff9051c6fa3b7cb845d3"} Mar 07 07:22:04 crc kubenswrapper[4941]: I0307 07:22:04.603977 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" Mar 07 07:22:04 crc kubenswrapper[4941]: I0307 07:22:04.757364 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvtln\" (UniqueName: \"kubernetes.io/projected/a7fcce16-6076-4b6b-af87-6d693670242c-kube-api-access-gvtln\") pod \"a7fcce16-6076-4b6b-af87-6d693670242c\" (UID: \"a7fcce16-6076-4b6b-af87-6d693670242c\") " Mar 07 07:22:04 crc kubenswrapper[4941]: I0307 07:22:04.763913 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fcce16-6076-4b6b-af87-6d693670242c-kube-api-access-gvtln" (OuterVolumeSpecName: "kube-api-access-gvtln") pod "a7fcce16-6076-4b6b-af87-6d693670242c" (UID: "a7fcce16-6076-4b6b-af87-6d693670242c"). InnerVolumeSpecName "kube-api-access-gvtln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:22:04 crc kubenswrapper[4941]: I0307 07:22:04.858841 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvtln\" (UniqueName: \"kubernetes.io/projected/a7fcce16-6076-4b6b-af87-6d693670242c-kube-api-access-gvtln\") on node \"crc\" DevicePath \"\"" Mar 07 07:22:05 crc kubenswrapper[4941]: I0307 07:22:05.234695 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" event={"ID":"a7fcce16-6076-4b6b-af87-6d693670242c","Type":"ContainerDied","Data":"eaf2e5bcfa7018ba350aaa6d61000e91d67fad5fadc8c3415de93957907891db"} Mar 07 07:22:05 crc kubenswrapper[4941]: I0307 07:22:05.235290 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf2e5bcfa7018ba350aaa6d61000e91d67fad5fadc8c3415de93957907891db" Mar 07 07:22:05 crc kubenswrapper[4941]: I0307 07:22:05.234766 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547802-tfzjw" Mar 07 07:22:05 crc kubenswrapper[4941]: I0307 07:22:05.294588 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547796-5ncn7"] Mar 07 07:22:05 crc kubenswrapper[4941]: I0307 07:22:05.299392 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547796-5ncn7"] Mar 07 07:22:05 crc kubenswrapper[4941]: I0307 07:22:05.963220 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5923db9b-eb36-4b13-a0e5-78ba9c2017b1" path="/var/lib/kubelet/pods/5923db9b-eb36-4b13-a0e5-78ba9c2017b1/volumes" Mar 07 07:22:07 crc kubenswrapper[4941]: I0307 07:22:07.819736 4941 scope.go:117] "RemoveContainer" containerID="9ba176036dee7f5c9afc1e0e8e5a89d8cb4fcf2342760faaa6466e404e2b45ac" Mar 07 07:22:07 crc kubenswrapper[4941]: I0307 07:22:07.870469 4941 scope.go:117] "RemoveContainer" 
containerID="4dfaa1caa1c945cdadc6069017eb59bb909a3a544cec38ebaf0cb8113f6668da" Mar 07 07:22:09 crc kubenswrapper[4941]: I0307 07:22:09.954897 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:22:09 crc kubenswrapper[4941]: E0307 07:22:09.955946 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:22:24 crc kubenswrapper[4941]: I0307 07:22:24.954534 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:22:24 crc kubenswrapper[4941]: E0307 07:22:24.955082 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:22:39 crc kubenswrapper[4941]: I0307 07:22:39.954601 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:22:39 crc kubenswrapper[4941]: E0307 07:22:39.955366 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:22:53 crc kubenswrapper[4941]: I0307 07:22:53.966692 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:22:53 crc kubenswrapper[4941]: E0307 07:22:53.967934 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:23:08 crc kubenswrapper[4941]: I0307 07:23:08.954684 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:23:08 crc kubenswrapper[4941]: E0307 07:23:08.955709 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:23:19 crc kubenswrapper[4941]: I0307 07:23:19.954661 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:23:20 crc kubenswrapper[4941]: I0307 07:23:20.885918 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"7a88073c6303a571ed602b8eda7126dc414ea6424a5b0973cb131c83c8213e24"} Mar 07 07:23:37 crc 
kubenswrapper[4941]: I0307 07:23:37.146769 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t2kcg"] Mar 07 07:23:37 crc kubenswrapper[4941]: E0307 07:23:37.148017 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fcce16-6076-4b6b-af87-6d693670242c" containerName="oc" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.148033 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fcce16-6076-4b6b-af87-6d693670242c" containerName="oc" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.148208 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fcce16-6076-4b6b-af87-6d693670242c" containerName="oc" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.149352 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.164131 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t2kcg"] Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.320085 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z94h\" (UniqueName: \"kubernetes.io/projected/17a0122b-d8af-4ab6-b937-40f5cecf86b6-kube-api-access-6z94h\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.320248 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-catalog-content\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.320284 4941 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-utilities\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.421666 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z94h\" (UniqueName: \"kubernetes.io/projected/17a0122b-d8af-4ab6-b937-40f5cecf86b6-kube-api-access-6z94h\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.421791 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-catalog-content\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.421825 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-utilities\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.422572 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-utilities\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.422627 4941 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-catalog-content\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.445871 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z94h\" (UniqueName: \"kubernetes.io/projected/17a0122b-d8af-4ab6-b937-40f5cecf86b6-kube-api-access-6z94h\") pod \"certified-operators-t2kcg\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:37 crc kubenswrapper[4941]: I0307 07:23:37.487775 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:38 crc kubenswrapper[4941]: I0307 07:23:38.010621 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t2kcg"] Mar 07 07:23:38 crc kubenswrapper[4941]: I0307 07:23:38.077624 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2kcg" event={"ID":"17a0122b-d8af-4ab6-b937-40f5cecf86b6","Type":"ContainerStarted","Data":"23b6dd6cf7035e2c7830aa4bddab70296199c29377efbea528f04ae428ec3134"} Mar 07 07:23:39 crc kubenswrapper[4941]: I0307 07:23:39.091201 4941 generic.go:334] "Generic (PLEG): container finished" podID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerID="fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af" exitCode=0 Mar 07 07:23:39 crc kubenswrapper[4941]: I0307 07:23:39.091288 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2kcg" event={"ID":"17a0122b-d8af-4ab6-b937-40f5cecf86b6","Type":"ContainerDied","Data":"fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af"} Mar 07 07:23:39 crc 
kubenswrapper[4941]: I0307 07:23:39.094174 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:23:40 crc kubenswrapper[4941]: I0307 07:23:40.099692 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2kcg" event={"ID":"17a0122b-d8af-4ab6-b937-40f5cecf86b6","Type":"ContainerStarted","Data":"be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82"} Mar 07 07:23:41 crc kubenswrapper[4941]: I0307 07:23:41.106561 4941 generic.go:334] "Generic (PLEG): container finished" podID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerID="be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82" exitCode=0 Mar 07 07:23:41 crc kubenswrapper[4941]: I0307 07:23:41.106648 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2kcg" event={"ID":"17a0122b-d8af-4ab6-b937-40f5cecf86b6","Type":"ContainerDied","Data":"be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82"} Mar 07 07:23:42 crc kubenswrapper[4941]: I0307 07:23:42.116372 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2kcg" event={"ID":"17a0122b-d8af-4ab6-b937-40f5cecf86b6","Type":"ContainerStarted","Data":"dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1"} Mar 07 07:23:42 crc kubenswrapper[4941]: I0307 07:23:42.137504 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t2kcg" podStartSLOduration=2.70882734 podStartE2EDuration="5.137480885s" podCreationTimestamp="2026-03-07 07:23:37 +0000 UTC" firstStartedPulling="2026-03-07 07:23:39.093871852 +0000 UTC m=+1916.046237327" lastFinishedPulling="2026-03-07 07:23:41.522525367 +0000 UTC m=+1918.474890872" observedRunningTime="2026-03-07 07:23:42.133340259 +0000 UTC m=+1919.085705744" watchObservedRunningTime="2026-03-07 07:23:42.137480885 +0000 UTC 
m=+1919.089846370" Mar 07 07:23:47 crc kubenswrapper[4941]: I0307 07:23:47.488946 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:47 crc kubenswrapper[4941]: I0307 07:23:47.489460 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:47 crc kubenswrapper[4941]: I0307 07:23:47.552953 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:48 crc kubenswrapper[4941]: I0307 07:23:48.205902 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:48 crc kubenswrapper[4941]: I0307 07:23:48.259568 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t2kcg"] Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.185650 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t2kcg" podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerName="registry-server" containerID="cri-o://dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1" gracePeriod=2 Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.630470 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.744004 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-utilities\") pod \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.744044 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-catalog-content\") pod \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.744090 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z94h\" (UniqueName: \"kubernetes.io/projected/17a0122b-d8af-4ab6-b937-40f5cecf86b6-kube-api-access-6z94h\") pod \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\" (UID: \"17a0122b-d8af-4ab6-b937-40f5cecf86b6\") " Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.745031 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-utilities" (OuterVolumeSpecName: "utilities") pod "17a0122b-d8af-4ab6-b937-40f5cecf86b6" (UID: "17a0122b-d8af-4ab6-b937-40f5cecf86b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.750625 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a0122b-d8af-4ab6-b937-40f5cecf86b6-kube-api-access-6z94h" (OuterVolumeSpecName: "kube-api-access-6z94h") pod "17a0122b-d8af-4ab6-b937-40f5cecf86b6" (UID: "17a0122b-d8af-4ab6-b937-40f5cecf86b6"). InnerVolumeSpecName "kube-api-access-6z94h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.845714 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.845818 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z94h\" (UniqueName: \"kubernetes.io/projected/17a0122b-d8af-4ab6-b937-40f5cecf86b6-kube-api-access-6z94h\") on node \"crc\" DevicePath \"\"" Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.921950 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17a0122b-d8af-4ab6-b937-40f5cecf86b6" (UID: "17a0122b-d8af-4ab6-b937-40f5cecf86b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:23:50 crc kubenswrapper[4941]: I0307 07:23:50.947968 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17a0122b-d8af-4ab6-b937-40f5cecf86b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.195921 4941 generic.go:334] "Generic (PLEG): container finished" podID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerID="dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1" exitCode=0 Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.195980 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2kcg" event={"ID":"17a0122b-d8af-4ab6-b937-40f5cecf86b6","Type":"ContainerDied","Data":"dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1"} Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.196011 4941 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t2kcg" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.196062 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2kcg" event={"ID":"17a0122b-d8af-4ab6-b937-40f5cecf86b6","Type":"ContainerDied","Data":"23b6dd6cf7035e2c7830aa4bddab70296199c29377efbea528f04ae428ec3134"} Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.196165 4941 scope.go:117] "RemoveContainer" containerID="dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.230858 4941 scope.go:117] "RemoveContainer" containerID="be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.237911 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t2kcg"] Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.242243 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t2kcg"] Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.263583 4941 scope.go:117] "RemoveContainer" containerID="fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.285696 4941 scope.go:117] "RemoveContainer" containerID="dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1" Mar 07 07:23:51 crc kubenswrapper[4941]: E0307 07:23:51.286124 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1\": container with ID starting with dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1 not found: ID does not exist" containerID="dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.286216 
4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1"} err="failed to get container status \"dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1\": rpc error: code = NotFound desc = could not find container \"dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1\": container with ID starting with dffa7a09a9a0b3c63fc47b3c726bf075d2154a2a885fdecef7c75df9c2f25eb1 not found: ID does not exist" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.286301 4941 scope.go:117] "RemoveContainer" containerID="be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82" Mar 07 07:23:51 crc kubenswrapper[4941]: E0307 07:23:51.286856 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82\": container with ID starting with be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82 not found: ID does not exist" containerID="be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.286898 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82"} err="failed to get container status \"be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82\": rpc error: code = NotFound desc = could not find container \"be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82\": container with ID starting with be896bf69bee61a0814da80273466910365bd9d9721a7415298d8b8bd658ad82 not found: ID does not exist" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.286925 4941 scope.go:117] "RemoveContainer" containerID="fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af" Mar 07 07:23:51 crc kubenswrapper[4941]: E0307 
07:23:51.287447 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af\": container with ID starting with fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af not found: ID does not exist" containerID="fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.287507 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af"} err="failed to get container status \"fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af\": rpc error: code = NotFound desc = could not find container \"fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af\": container with ID starting with fe7ca2c5f01d41a8f02bfb2551428c55486db3db07b003ecba783fccb1a3d6af not found: ID does not exist" Mar 07 07:23:51 crc kubenswrapper[4941]: I0307 07:23:51.968221 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" path="/var/lib/kubelet/pods/17a0122b-d8af-4ab6-b937-40f5cecf86b6/volumes" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.150299 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547804-hkn5q"] Mar 07 07:24:00 crc kubenswrapper[4941]: E0307 07:24:00.151252 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerName="extract-content" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.151270 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerName="extract-content" Mar 07 07:24:00 crc kubenswrapper[4941]: E0307 07:24:00.151286 4941 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerName="extract-utilities" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.151294 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerName="extract-utilities" Mar 07 07:24:00 crc kubenswrapper[4941]: E0307 07:24:00.151308 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerName="registry-server" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.151317 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerName="registry-server" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.151582 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a0122b-d8af-4ab6-b937-40f5cecf86b6" containerName="registry-server" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.152115 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.156470 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.156830 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.159256 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.167723 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547804-hkn5q"] Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.311168 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzpg4\" (UniqueName: 
\"kubernetes.io/projected/325ab554-4cc1-4530-b655-369b62854330-kube-api-access-lzpg4\") pod \"auto-csr-approver-29547804-hkn5q\" (UID: \"325ab554-4cc1-4530-b655-369b62854330\") " pod="openshift-infra/auto-csr-approver-29547804-hkn5q" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.413037 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzpg4\" (UniqueName: \"kubernetes.io/projected/325ab554-4cc1-4530-b655-369b62854330-kube-api-access-lzpg4\") pod \"auto-csr-approver-29547804-hkn5q\" (UID: \"325ab554-4cc1-4530-b655-369b62854330\") " pod="openshift-infra/auto-csr-approver-29547804-hkn5q" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.448960 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzpg4\" (UniqueName: \"kubernetes.io/projected/325ab554-4cc1-4530-b655-369b62854330-kube-api-access-lzpg4\") pod \"auto-csr-approver-29547804-hkn5q\" (UID: \"325ab554-4cc1-4530-b655-369b62854330\") " pod="openshift-infra/auto-csr-approver-29547804-hkn5q" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.477249 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" Mar 07 07:24:00 crc kubenswrapper[4941]: I0307 07:24:00.693180 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547804-hkn5q"] Mar 07 07:24:01 crc kubenswrapper[4941]: I0307 07:24:01.282854 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" event={"ID":"325ab554-4cc1-4530-b655-369b62854330","Type":"ContainerStarted","Data":"44e020cc4a37026a5749a94c74718b0c6699e4209e6d281d5ce05756bac29588"} Mar 07 07:24:02 crc kubenswrapper[4941]: I0307 07:24:02.291364 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" event={"ID":"325ab554-4cc1-4530-b655-369b62854330","Type":"ContainerStarted","Data":"83c353ce79f938e4dba9e0d39329ecd3a6eb6a52c9f3988ce15c5788ada3e8d0"} Mar 07 07:24:02 crc kubenswrapper[4941]: I0307 07:24:02.305864 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" podStartSLOduration=1.2871332 podStartE2EDuration="2.305835336s" podCreationTimestamp="2026-03-07 07:24:00 +0000 UTC" firstStartedPulling="2026-03-07 07:24:00.707206405 +0000 UTC m=+1937.659571870" lastFinishedPulling="2026-03-07 07:24:01.725908511 +0000 UTC m=+1938.678274006" observedRunningTime="2026-03-07 07:24:02.303103247 +0000 UTC m=+1939.255468752" watchObservedRunningTime="2026-03-07 07:24:02.305835336 +0000 UTC m=+1939.258200841" Mar 07 07:24:03 crc kubenswrapper[4941]: I0307 07:24:03.301705 4941 generic.go:334] "Generic (PLEG): container finished" podID="325ab554-4cc1-4530-b655-369b62854330" containerID="83c353ce79f938e4dba9e0d39329ecd3a6eb6a52c9f3988ce15c5788ada3e8d0" exitCode=0 Mar 07 07:24:03 crc kubenswrapper[4941]: I0307 07:24:03.301752 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" 
event={"ID":"325ab554-4cc1-4530-b655-369b62854330","Type":"ContainerDied","Data":"83c353ce79f938e4dba9e0d39329ecd3a6eb6a52c9f3988ce15c5788ada3e8d0"} Mar 07 07:24:04 crc kubenswrapper[4941]: I0307 07:24:04.627096 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" Mar 07 07:24:04 crc kubenswrapper[4941]: I0307 07:24:04.776568 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzpg4\" (UniqueName: \"kubernetes.io/projected/325ab554-4cc1-4530-b655-369b62854330-kube-api-access-lzpg4\") pod \"325ab554-4cc1-4530-b655-369b62854330\" (UID: \"325ab554-4cc1-4530-b655-369b62854330\") " Mar 07 07:24:04 crc kubenswrapper[4941]: I0307 07:24:04.781600 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325ab554-4cc1-4530-b655-369b62854330-kube-api-access-lzpg4" (OuterVolumeSpecName: "kube-api-access-lzpg4") pod "325ab554-4cc1-4530-b655-369b62854330" (UID: "325ab554-4cc1-4530-b655-369b62854330"). InnerVolumeSpecName "kube-api-access-lzpg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:24:04 crc kubenswrapper[4941]: I0307 07:24:04.877591 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzpg4\" (UniqueName: \"kubernetes.io/projected/325ab554-4cc1-4530-b655-369b62854330-kube-api-access-lzpg4\") on node \"crc\" DevicePath \"\"" Mar 07 07:24:05 crc kubenswrapper[4941]: I0307 07:24:05.316131 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" event={"ID":"325ab554-4cc1-4530-b655-369b62854330","Type":"ContainerDied","Data":"44e020cc4a37026a5749a94c74718b0c6699e4209e6d281d5ce05756bac29588"} Mar 07 07:24:05 crc kubenswrapper[4941]: I0307 07:24:05.316473 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44e020cc4a37026a5749a94c74718b0c6699e4209e6d281d5ce05756bac29588" Mar 07 07:24:05 crc kubenswrapper[4941]: I0307 07:24:05.316267 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547804-hkn5q" Mar 07 07:24:05 crc kubenswrapper[4941]: I0307 07:24:05.373332 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547798-dmcvg"] Mar 07 07:24:05 crc kubenswrapper[4941]: I0307 07:24:05.378492 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547798-dmcvg"] Mar 07 07:24:05 crc kubenswrapper[4941]: I0307 07:24:05.964519 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74764658-d199-4d6b-8d0f-b04f0839500c" path="/var/lib/kubelet/pods/74764658-d199-4d6b-8d0f-b04f0839500c/volumes" Mar 07 07:24:07 crc kubenswrapper[4941]: I0307 07:24:07.989602 4941 scope.go:117] "RemoveContainer" containerID="874045616cd659190821ba7f859a7926099cc05a1035859abdb89bcd9b16bddb" Mar 07 07:25:40 crc kubenswrapper[4941]: I0307 07:25:40.314534 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:25:40 crc kubenswrapper[4941]: I0307 07:25:40.316296 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.165082 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547806-2ktwf"] Mar 07 07:26:00 crc kubenswrapper[4941]: E0307 07:26:00.165909 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325ab554-4cc1-4530-b655-369b62854330" containerName="oc" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.165922 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="325ab554-4cc1-4530-b655-369b62854330" containerName="oc" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.166063 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="325ab554-4cc1-4530-b655-369b62854330" containerName="oc" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.166505 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547806-2ktwf" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.169817 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.170209 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.173304 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.182910 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547806-2ktwf"] Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.335719 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/1e971412-547e-4b69-b4e7-6b3f3081eb92-kube-api-access-km4jv\") pod \"auto-csr-approver-29547806-2ktwf\" (UID: \"1e971412-547e-4b69-b4e7-6b3f3081eb92\") " pod="openshift-infra/auto-csr-approver-29547806-2ktwf" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.438245 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/1e971412-547e-4b69-b4e7-6b3f3081eb92-kube-api-access-km4jv\") pod \"auto-csr-approver-29547806-2ktwf\" (UID: \"1e971412-547e-4b69-b4e7-6b3f3081eb92\") " pod="openshift-infra/auto-csr-approver-29547806-2ktwf" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.473382 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/1e971412-547e-4b69-b4e7-6b3f3081eb92-kube-api-access-km4jv\") pod \"auto-csr-approver-29547806-2ktwf\" (UID: \"1e971412-547e-4b69-b4e7-6b3f3081eb92\") " 
pod="openshift-infra/auto-csr-approver-29547806-2ktwf" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.495271 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547806-2ktwf" Mar 07 07:26:00 crc kubenswrapper[4941]: I0307 07:26:00.955595 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547806-2ktwf"] Mar 07 07:26:01 crc kubenswrapper[4941]: I0307 07:26:01.313559 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547806-2ktwf" event={"ID":"1e971412-547e-4b69-b4e7-6b3f3081eb92","Type":"ContainerStarted","Data":"cb369f05769c5a04996da4c7d40adad6ccc9a5035090e81c87c0cf98947ef573"} Mar 07 07:26:02 crc kubenswrapper[4941]: I0307 07:26:02.323550 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547806-2ktwf" event={"ID":"1e971412-547e-4b69-b4e7-6b3f3081eb92","Type":"ContainerStarted","Data":"46e9934c7c5e526698254db7a094fad5839752773304a0c53226f0e2a3986e83"} Mar 07 07:26:03 crc kubenswrapper[4941]: I0307 07:26:03.334699 4941 generic.go:334] "Generic (PLEG): container finished" podID="1e971412-547e-4b69-b4e7-6b3f3081eb92" containerID="46e9934c7c5e526698254db7a094fad5839752773304a0c53226f0e2a3986e83" exitCode=0 Mar 07 07:26:03 crc kubenswrapper[4941]: I0307 07:26:03.334809 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547806-2ktwf" event={"ID":"1e971412-547e-4b69-b4e7-6b3f3081eb92","Type":"ContainerDied","Data":"46e9934c7c5e526698254db7a094fad5839752773304a0c53226f0e2a3986e83"} Mar 07 07:26:04 crc kubenswrapper[4941]: I0307 07:26:04.728071 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547806-2ktwf" Mar 07 07:26:04 crc kubenswrapper[4941]: I0307 07:26:04.806181 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/1e971412-547e-4b69-b4e7-6b3f3081eb92-kube-api-access-km4jv\") pod \"1e971412-547e-4b69-b4e7-6b3f3081eb92\" (UID: \"1e971412-547e-4b69-b4e7-6b3f3081eb92\") " Mar 07 07:26:04 crc kubenswrapper[4941]: I0307 07:26:04.810797 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e971412-547e-4b69-b4e7-6b3f3081eb92-kube-api-access-km4jv" (OuterVolumeSpecName: "kube-api-access-km4jv") pod "1e971412-547e-4b69-b4e7-6b3f3081eb92" (UID: "1e971412-547e-4b69-b4e7-6b3f3081eb92"). InnerVolumeSpecName "kube-api-access-km4jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:26:04 crc kubenswrapper[4941]: I0307 07:26:04.908189 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km4jv\" (UniqueName: \"kubernetes.io/projected/1e971412-547e-4b69-b4e7-6b3f3081eb92-kube-api-access-km4jv\") on node \"crc\" DevicePath \"\"" Mar 07 07:26:05 crc kubenswrapper[4941]: I0307 07:26:05.356462 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547806-2ktwf" event={"ID":"1e971412-547e-4b69-b4e7-6b3f3081eb92","Type":"ContainerDied","Data":"cb369f05769c5a04996da4c7d40adad6ccc9a5035090e81c87c0cf98947ef573"} Mar 07 07:26:05 crc kubenswrapper[4941]: I0307 07:26:05.356918 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb369f05769c5a04996da4c7d40adad6ccc9a5035090e81c87c0cf98947ef573" Mar 07 07:26:05 crc kubenswrapper[4941]: I0307 07:26:05.356566 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547806-2ktwf" Mar 07 07:26:05 crc kubenswrapper[4941]: I0307 07:26:05.436829 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547800-gkbnj"] Mar 07 07:26:05 crc kubenswrapper[4941]: I0307 07:26:05.446548 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547800-gkbnj"] Mar 07 07:26:05 crc kubenswrapper[4941]: I0307 07:26:05.979202 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29a8223-a33f-47bd-b4a6-6053f8b0af5a" path="/var/lib/kubelet/pods/e29a8223-a33f-47bd-b4a6-6053f8b0af5a/volumes" Mar 07 07:26:08 crc kubenswrapper[4941]: I0307 07:26:08.101761 4941 scope.go:117] "RemoveContainer" containerID="9396222f588da070376b56bef1762286810c8fa1663143ad9c303bdfa91f38fc" Mar 07 07:26:10 crc kubenswrapper[4941]: I0307 07:26:10.314055 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:26:10 crc kubenswrapper[4941]: I0307 07:26:10.314132 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:26:40 crc kubenswrapper[4941]: I0307 07:26:40.314514 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:26:40 crc kubenswrapper[4941]: 
I0307 07:26:40.315221 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:26:40 crc kubenswrapper[4941]: I0307 07:26:40.315273 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:26:40 crc kubenswrapper[4941]: I0307 07:26:40.316037 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a88073c6303a571ed602b8eda7126dc414ea6424a5b0973cb131c83c8213e24"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:26:40 crc kubenswrapper[4941]: I0307 07:26:40.316112 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://7a88073c6303a571ed602b8eda7126dc414ea6424a5b0973cb131c83c8213e24" gracePeriod=600 Mar 07 07:26:40 crc kubenswrapper[4941]: I0307 07:26:40.678360 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="7a88073c6303a571ed602b8eda7126dc414ea6424a5b0973cb131c83c8213e24" exitCode=0 Mar 07 07:26:40 crc kubenswrapper[4941]: I0307 07:26:40.678455 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"7a88073c6303a571ed602b8eda7126dc414ea6424a5b0973cb131c83c8213e24"} Mar 07 07:26:40 crc 
kubenswrapper[4941]: I0307 07:26:40.678840 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85"} Mar 07 07:26:40 crc kubenswrapper[4941]: I0307 07:26:40.678873 4941 scope.go:117] "RemoveContainer" containerID="c0eb8a9b6600255020e363972055bd483dd68176b231e7308da1106e2b95fa97" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.266123 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nv5sd"] Mar 07 07:27:51 crc kubenswrapper[4941]: E0307 07:27:51.267259 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e971412-547e-4b69-b4e7-6b3f3081eb92" containerName="oc" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.267281 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e971412-547e-4b69-b4e7-6b3f3081eb92" containerName="oc" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.267544 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e971412-547e-4b69-b4e7-6b3f3081eb92" containerName="oc" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.269066 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.281239 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nv5sd"] Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.386594 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-utilities\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.386667 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r558h\" (UniqueName: \"kubernetes.io/projected/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-kube-api-access-r558h\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.386739 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-catalog-content\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.488244 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-utilities\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.488304 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r558h\" (UniqueName: \"kubernetes.io/projected/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-kube-api-access-r558h\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.488371 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-catalog-content\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.488750 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-utilities\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.488825 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-catalog-content\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.508123 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r558h\" (UniqueName: \"kubernetes.io/projected/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-kube-api-access-r558h\") pod \"redhat-operators-nv5sd\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:51 crc kubenswrapper[4941]: I0307 07:27:51.612144 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:27:52 crc kubenswrapper[4941]: I0307 07:27:52.062849 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nv5sd"] Mar 07 07:27:52 crc kubenswrapper[4941]: I0307 07:27:52.306168 4941 generic.go:334] "Generic (PLEG): container finished" podID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerID="ade07e0a18955f52933fc4da4b3ea09d35b252af9a5a9ac9ec4807bd49a83544" exitCode=0 Mar 07 07:27:52 crc kubenswrapper[4941]: I0307 07:27:52.306277 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nv5sd" event={"ID":"47aad1a0-788e-49ef-9dcb-6075b34a0ac9","Type":"ContainerDied","Data":"ade07e0a18955f52933fc4da4b3ea09d35b252af9a5a9ac9ec4807bd49a83544"} Mar 07 07:27:52 crc kubenswrapper[4941]: I0307 07:27:52.306460 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nv5sd" event={"ID":"47aad1a0-788e-49ef-9dcb-6075b34a0ac9","Type":"ContainerStarted","Data":"f5999a67d4249ea33d4b28531763b6697d371cef3f11d614400c637e6d0f6792"} Mar 07 07:27:53 crc kubenswrapper[4941]: I0307 07:27:53.317655 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nv5sd" event={"ID":"47aad1a0-788e-49ef-9dcb-6075b34a0ac9","Type":"ContainerStarted","Data":"2203db70472a424189576e8c3c420c78a08f3ad79a3f391c7c020268e4f10bb7"} Mar 07 07:27:54 crc kubenswrapper[4941]: I0307 07:27:54.329010 4941 generic.go:334] "Generic (PLEG): container finished" podID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerID="2203db70472a424189576e8c3c420c78a08f3ad79a3f391c7c020268e4f10bb7" exitCode=0 Mar 07 07:27:54 crc kubenswrapper[4941]: I0307 07:27:54.329199 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nv5sd" 
event={"ID":"47aad1a0-788e-49ef-9dcb-6075b34a0ac9","Type":"ContainerDied","Data":"2203db70472a424189576e8c3c420c78a08f3ad79a3f391c7c020268e4f10bb7"} Mar 07 07:27:55 crc kubenswrapper[4941]: I0307 07:27:55.336879 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nv5sd" event={"ID":"47aad1a0-788e-49ef-9dcb-6075b34a0ac9","Type":"ContainerStarted","Data":"5f53070ef9d6a4aac7f62a70bbc9041e125ed26b3bf225f71120363d2a5b27d1"} Mar 07 07:27:55 crc kubenswrapper[4941]: I0307 07:27:55.358008 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nv5sd" podStartSLOduration=1.918724678 podStartE2EDuration="4.357984139s" podCreationTimestamp="2026-03-07 07:27:51 +0000 UTC" firstStartedPulling="2026-03-07 07:27:52.307494395 +0000 UTC m=+2169.259859860" lastFinishedPulling="2026-03-07 07:27:54.746753836 +0000 UTC m=+2171.699119321" observedRunningTime="2026-03-07 07:27:55.353979891 +0000 UTC m=+2172.306345356" watchObservedRunningTime="2026-03-07 07:27:55.357984139 +0000 UTC m=+2172.310349644" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.153737 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547808-br5ss"] Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.155227 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547808-br5ss" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.157384 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.157574 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.158057 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.177663 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547808-br5ss"] Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.313913 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbjbn\" (UniqueName: \"kubernetes.io/projected/0d6549ed-e6d7-409f-9a6d-995abd2a46a4-kube-api-access-zbjbn\") pod \"auto-csr-approver-29547808-br5ss\" (UID: \"0d6549ed-e6d7-409f-9a6d-995abd2a46a4\") " pod="openshift-infra/auto-csr-approver-29547808-br5ss" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.415004 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbjbn\" (UniqueName: \"kubernetes.io/projected/0d6549ed-e6d7-409f-9a6d-995abd2a46a4-kube-api-access-zbjbn\") pod \"auto-csr-approver-29547808-br5ss\" (UID: \"0d6549ed-e6d7-409f-9a6d-995abd2a46a4\") " pod="openshift-infra/auto-csr-approver-29547808-br5ss" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.443280 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbjbn\" (UniqueName: \"kubernetes.io/projected/0d6549ed-e6d7-409f-9a6d-995abd2a46a4-kube-api-access-zbjbn\") pod \"auto-csr-approver-29547808-br5ss\" (UID: \"0d6549ed-e6d7-409f-9a6d-995abd2a46a4\") " 
pod="openshift-infra/auto-csr-approver-29547808-br5ss" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.496529 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547808-br5ss" Mar 07 07:28:00 crc kubenswrapper[4941]: I0307 07:28:00.981092 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547808-br5ss"] Mar 07 07:28:01 crc kubenswrapper[4941]: I0307 07:28:01.382863 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547808-br5ss" event={"ID":"0d6549ed-e6d7-409f-9a6d-995abd2a46a4","Type":"ContainerStarted","Data":"b00ca0a326e7f72637049909e058e4a990232312126d176c30ed64c58fc225e6"} Mar 07 07:28:01 crc kubenswrapper[4941]: I0307 07:28:01.613274 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:28:01 crc kubenswrapper[4941]: I0307 07:28:01.613351 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:28:01 crc kubenswrapper[4941]: I0307 07:28:01.680947 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:28:02 crc kubenswrapper[4941]: I0307 07:28:02.392904 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547808-br5ss" event={"ID":"0d6549ed-e6d7-409f-9a6d-995abd2a46a4","Type":"ContainerStarted","Data":"893dfb60219c187fb5841ab7f2877c6e849082f254afe142c0b4675c00c8f782"} Mar 07 07:28:02 crc kubenswrapper[4941]: I0307 07:28:02.411547 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547808-br5ss" podStartSLOduration=1.363415796 podStartE2EDuration="2.411519376s" podCreationTimestamp="2026-03-07 07:28:00 +0000 UTC" firstStartedPulling="2026-03-07 07:28:01.000233268 
+0000 UTC m=+2177.952598743" lastFinishedPulling="2026-03-07 07:28:02.048336848 +0000 UTC m=+2179.000702323" observedRunningTime="2026-03-07 07:28:02.407502767 +0000 UTC m=+2179.359868242" watchObservedRunningTime="2026-03-07 07:28:02.411519376 +0000 UTC m=+2179.363884841" Mar 07 07:28:02 crc kubenswrapper[4941]: I0307 07:28:02.455979 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:28:02 crc kubenswrapper[4941]: I0307 07:28:02.512933 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nv5sd"] Mar 07 07:28:03 crc kubenswrapper[4941]: I0307 07:28:03.402181 4941 generic.go:334] "Generic (PLEG): container finished" podID="0d6549ed-e6d7-409f-9a6d-995abd2a46a4" containerID="893dfb60219c187fb5841ab7f2877c6e849082f254afe142c0b4675c00c8f782" exitCode=0 Mar 07 07:28:03 crc kubenswrapper[4941]: I0307 07:28:03.402265 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547808-br5ss" event={"ID":"0d6549ed-e6d7-409f-9a6d-995abd2a46a4","Type":"ContainerDied","Data":"893dfb60219c187fb5841ab7f2877c6e849082f254afe142c0b4675c00c8f782"} Mar 07 07:28:04 crc kubenswrapper[4941]: I0307 07:28:04.408234 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nv5sd" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerName="registry-server" containerID="cri-o://5f53070ef9d6a4aac7f62a70bbc9041e125ed26b3bf225f71120363d2a5b27d1" gracePeriod=2 Mar 07 07:28:04 crc kubenswrapper[4941]: I0307 07:28:04.763082 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547808-br5ss" Mar 07 07:28:04 crc kubenswrapper[4941]: I0307 07:28:04.791695 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbjbn\" (UniqueName: \"kubernetes.io/projected/0d6549ed-e6d7-409f-9a6d-995abd2a46a4-kube-api-access-zbjbn\") pod \"0d6549ed-e6d7-409f-9a6d-995abd2a46a4\" (UID: \"0d6549ed-e6d7-409f-9a6d-995abd2a46a4\") " Mar 07 07:28:04 crc kubenswrapper[4941]: I0307 07:28:04.803667 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d6549ed-e6d7-409f-9a6d-995abd2a46a4-kube-api-access-zbjbn" (OuterVolumeSpecName: "kube-api-access-zbjbn") pod "0d6549ed-e6d7-409f-9a6d-995abd2a46a4" (UID: "0d6549ed-e6d7-409f-9a6d-995abd2a46a4"). InnerVolumeSpecName "kube-api-access-zbjbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:28:04 crc kubenswrapper[4941]: I0307 07:28:04.892849 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbjbn\" (UniqueName: \"kubernetes.io/projected/0d6549ed-e6d7-409f-9a6d-995abd2a46a4-kube-api-access-zbjbn\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:05 crc kubenswrapper[4941]: I0307 07:28:05.421771 4941 generic.go:334] "Generic (PLEG): container finished" podID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerID="5f53070ef9d6a4aac7f62a70bbc9041e125ed26b3bf225f71120363d2a5b27d1" exitCode=0 Mar 07 07:28:05 crc kubenswrapper[4941]: I0307 07:28:05.421843 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nv5sd" event={"ID":"47aad1a0-788e-49ef-9dcb-6075b34a0ac9","Type":"ContainerDied","Data":"5f53070ef9d6a4aac7f62a70bbc9041e125ed26b3bf225f71120363d2a5b27d1"} Mar 07 07:28:05 crc kubenswrapper[4941]: I0307 07:28:05.424017 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547808-br5ss" 
event={"ID":"0d6549ed-e6d7-409f-9a6d-995abd2a46a4","Type":"ContainerDied","Data":"b00ca0a326e7f72637049909e058e4a990232312126d176c30ed64c58fc225e6"} Mar 07 07:28:05 crc kubenswrapper[4941]: I0307 07:28:05.424074 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b00ca0a326e7f72637049909e058e4a990232312126d176c30ed64c58fc225e6" Mar 07 07:28:05 crc kubenswrapper[4941]: I0307 07:28:05.424110 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547808-br5ss" Mar 07 07:28:05 crc kubenswrapper[4941]: I0307 07:28:05.482637 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547802-tfzjw"] Mar 07 07:28:05 crc kubenswrapper[4941]: I0307 07:28:05.489457 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547802-tfzjw"] Mar 07 07:28:05 crc kubenswrapper[4941]: I0307 07:28:05.963552 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fcce16-6076-4b6b-af87-6d693670242c" path="/var/lib/kubelet/pods/a7fcce16-6076-4b6b-af87-6d693670242c/volumes" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.037724 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.207895 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r558h\" (UniqueName: \"kubernetes.io/projected/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-kube-api-access-r558h\") pod \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.208027 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-utilities\") pod \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.208060 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-catalog-content\") pod \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\" (UID: \"47aad1a0-788e-49ef-9dcb-6075b34a0ac9\") " Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.209854 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-utilities" (OuterVolumeSpecName: "utilities") pod "47aad1a0-788e-49ef-9dcb-6075b34a0ac9" (UID: "47aad1a0-788e-49ef-9dcb-6075b34a0ac9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.217506 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-kube-api-access-r558h" (OuterVolumeSpecName: "kube-api-access-r558h") pod "47aad1a0-788e-49ef-9dcb-6075b34a0ac9" (UID: "47aad1a0-788e-49ef-9dcb-6075b34a0ac9"). InnerVolumeSpecName "kube-api-access-r558h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.309553 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r558h\" (UniqueName: \"kubernetes.io/projected/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-kube-api-access-r558h\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.309868 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.369925 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47aad1a0-788e-49ef-9dcb-6075b34a0ac9" (UID: "47aad1a0-788e-49ef-9dcb-6075b34a0ac9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.411242 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47aad1a0-788e-49ef-9dcb-6075b34a0ac9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.431615 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nv5sd" event={"ID":"47aad1a0-788e-49ef-9dcb-6075b34a0ac9","Type":"ContainerDied","Data":"f5999a67d4249ea33d4b28531763b6697d371cef3f11d614400c637e6d0f6792"} Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.431649 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nv5sd" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.431680 4941 scope.go:117] "RemoveContainer" containerID="5f53070ef9d6a4aac7f62a70bbc9041e125ed26b3bf225f71120363d2a5b27d1" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.449455 4941 scope.go:117] "RemoveContainer" containerID="2203db70472a424189576e8c3c420c78a08f3ad79a3f391c7c020268e4f10bb7" Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.462613 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nv5sd"] Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.468063 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nv5sd"] Mar 07 07:28:06 crc kubenswrapper[4941]: I0307 07:28:06.472732 4941 scope.go:117] "RemoveContainer" containerID="ade07e0a18955f52933fc4da4b3ea09d35b252af9a5a9ac9ec4807bd49a83544" Mar 07 07:28:07 crc kubenswrapper[4941]: I0307 07:28:07.979999 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" path="/var/lib/kubelet/pods/47aad1a0-788e-49ef-9dcb-6075b34a0ac9/volumes" Mar 07 07:28:08 crc kubenswrapper[4941]: I0307 07:28:08.185519 4941 scope.go:117] "RemoveContainer" containerID="04fb31e637ed76e095c99f1a40c4aac6c7bc397114c1ff9051c6fa3b7cb845d3" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.683500 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mrf8v"] Mar 07 07:28:33 crc kubenswrapper[4941]: E0307 07:28:33.684255 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6549ed-e6d7-409f-9a6d-995abd2a46a4" containerName="oc" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.684271 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6549ed-e6d7-409f-9a6d-995abd2a46a4" containerName="oc" Mar 07 07:28:33 crc kubenswrapper[4941]: E0307 07:28:33.684295 4941 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerName="registry-server" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.684302 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerName="registry-server" Mar 07 07:28:33 crc kubenswrapper[4941]: E0307 07:28:33.684314 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerName="extract-content" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.684322 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerName="extract-content" Mar 07 07:28:33 crc kubenswrapper[4941]: E0307 07:28:33.684348 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerName="extract-utilities" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.684356 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerName="extract-utilities" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.684563 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="47aad1a0-788e-49ef-9dcb-6075b34a0ac9" containerName="registry-server" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.684580 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6549ed-e6d7-409f-9a6d-995abd2a46a4" containerName="oc" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.685708 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.705596 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrf8v"] Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.804744 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt96d\" (UniqueName: \"kubernetes.io/projected/d64bc0e3-30f7-4bcf-8348-b691bfed586b-kube-api-access-rt96d\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.805047 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-utilities\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.805188 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-catalog-content\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.906358 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt96d\" (UniqueName: \"kubernetes.io/projected/d64bc0e3-30f7-4bcf-8348-b691bfed586b-kube-api-access-rt96d\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.906422 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-utilities\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.906446 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-catalog-content\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.907037 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-catalog-content\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.907069 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-utilities\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:33 crc kubenswrapper[4941]: I0307 07:28:33.933099 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt96d\" (UniqueName: \"kubernetes.io/projected/d64bc0e3-30f7-4bcf-8348-b691bfed586b-kube-api-access-rt96d\") pod \"community-operators-mrf8v\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:34 crc kubenswrapper[4941]: I0307 07:28:34.006484 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:34 crc kubenswrapper[4941]: I0307 07:28:34.463344 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrf8v"] Mar 07 07:28:34 crc kubenswrapper[4941]: W0307 07:28:34.468720 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd64bc0e3_30f7_4bcf_8348_b691bfed586b.slice/crio-e3fd005c8eba2c0850ddaf02d224d4c47e190ee0c36cf0141e3d0ed1dcfab60d WatchSource:0}: Error finding container e3fd005c8eba2c0850ddaf02d224d4c47e190ee0c36cf0141e3d0ed1dcfab60d: Status 404 returned error can't find the container with id e3fd005c8eba2c0850ddaf02d224d4c47e190ee0c36cf0141e3d0ed1dcfab60d Mar 07 07:28:34 crc kubenswrapper[4941]: I0307 07:28:34.671576 4941 generic.go:334] "Generic (PLEG): container finished" podID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerID="c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4" exitCode=0 Mar 07 07:28:34 crc kubenswrapper[4941]: I0307 07:28:34.671632 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrf8v" event={"ID":"d64bc0e3-30f7-4bcf-8348-b691bfed586b","Type":"ContainerDied","Data":"c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4"} Mar 07 07:28:34 crc kubenswrapper[4941]: I0307 07:28:34.671924 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrf8v" event={"ID":"d64bc0e3-30f7-4bcf-8348-b691bfed586b","Type":"ContainerStarted","Data":"e3fd005c8eba2c0850ddaf02d224d4c47e190ee0c36cf0141e3d0ed1dcfab60d"} Mar 07 07:28:35 crc kubenswrapper[4941]: I0307 07:28:35.679723 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrf8v" 
event={"ID":"d64bc0e3-30f7-4bcf-8348-b691bfed586b","Type":"ContainerStarted","Data":"ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d"} Mar 07 07:28:36 crc kubenswrapper[4941]: I0307 07:28:36.690345 4941 generic.go:334] "Generic (PLEG): container finished" podID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerID="ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d" exitCode=0 Mar 07 07:28:36 crc kubenswrapper[4941]: I0307 07:28:36.690396 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrf8v" event={"ID":"d64bc0e3-30f7-4bcf-8348-b691bfed586b","Type":"ContainerDied","Data":"ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d"} Mar 07 07:28:37 crc kubenswrapper[4941]: I0307 07:28:37.699587 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrf8v" event={"ID":"d64bc0e3-30f7-4bcf-8348-b691bfed586b","Type":"ContainerStarted","Data":"a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6"} Mar 07 07:28:40 crc kubenswrapper[4941]: I0307 07:28:40.314324 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:28:40 crc kubenswrapper[4941]: I0307 07:28:40.314643 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:28:44 crc kubenswrapper[4941]: I0307 07:28:44.007510 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mrf8v" Mar 07 
07:28:44 crc kubenswrapper[4941]: I0307 07:28:44.007921 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:44 crc kubenswrapper[4941]: I0307 07:28:44.054242 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:44 crc kubenswrapper[4941]: I0307 07:28:44.077564 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mrf8v" podStartSLOduration=8.636990446 podStartE2EDuration="11.077542469s" podCreationTimestamp="2026-03-07 07:28:33 +0000 UTC" firstStartedPulling="2026-03-07 07:28:34.673345308 +0000 UTC m=+2211.625710793" lastFinishedPulling="2026-03-07 07:28:37.113897341 +0000 UTC m=+2214.066262816" observedRunningTime="2026-03-07 07:28:37.718254514 +0000 UTC m=+2214.670619979" watchObservedRunningTime="2026-03-07 07:28:44.077542469 +0000 UTC m=+2221.029907944" Mar 07 07:28:44 crc kubenswrapper[4941]: I0307 07:28:44.821618 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:44 crc kubenswrapper[4941]: I0307 07:28:44.879696 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrf8v"] Mar 07 07:28:46 crc kubenswrapper[4941]: I0307 07:28:46.783981 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mrf8v" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerName="registry-server" containerID="cri-o://a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6" gracePeriod=2 Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.181401 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.250798 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-utilities\") pod \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.250905 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-catalog-content\") pod \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.250959 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt96d\" (UniqueName: \"kubernetes.io/projected/d64bc0e3-30f7-4bcf-8348-b691bfed586b-kube-api-access-rt96d\") pod \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\" (UID: \"d64bc0e3-30f7-4bcf-8348-b691bfed586b\") " Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.251824 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-utilities" (OuterVolumeSpecName: "utilities") pod "d64bc0e3-30f7-4bcf-8348-b691bfed586b" (UID: "d64bc0e3-30f7-4bcf-8348-b691bfed586b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.255985 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64bc0e3-30f7-4bcf-8348-b691bfed586b-kube-api-access-rt96d" (OuterVolumeSpecName: "kube-api-access-rt96d") pod "d64bc0e3-30f7-4bcf-8348-b691bfed586b" (UID: "d64bc0e3-30f7-4bcf-8348-b691bfed586b"). InnerVolumeSpecName "kube-api-access-rt96d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.305838 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d64bc0e3-30f7-4bcf-8348-b691bfed586b" (UID: "d64bc0e3-30f7-4bcf-8348-b691bfed586b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.352816 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.353093 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt96d\" (UniqueName: \"kubernetes.io/projected/d64bc0e3-30f7-4bcf-8348-b691bfed586b-kube-api-access-rt96d\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.353187 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d64bc0e3-30f7-4bcf-8348-b691bfed586b-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.793264 4941 generic.go:334] "Generic (PLEG): container finished" podID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerID="a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6" exitCode=0 Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.793327 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrf8v" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.793364 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrf8v" event={"ID":"d64bc0e3-30f7-4bcf-8348-b691bfed586b","Type":"ContainerDied","Data":"a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6"} Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.793722 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrf8v" event={"ID":"d64bc0e3-30f7-4bcf-8348-b691bfed586b","Type":"ContainerDied","Data":"e3fd005c8eba2c0850ddaf02d224d4c47e190ee0c36cf0141e3d0ed1dcfab60d"} Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.793757 4941 scope.go:117] "RemoveContainer" containerID="a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.817644 4941 scope.go:117] "RemoveContainer" containerID="ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.830272 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrf8v"] Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.837246 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mrf8v"] Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.840446 4941 scope.go:117] "RemoveContainer" containerID="c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.875907 4941 scope.go:117] "RemoveContainer" containerID="a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6" Mar 07 07:28:47 crc kubenswrapper[4941]: E0307 07:28:47.876561 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6\": container with ID starting with a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6 not found: ID does not exist" containerID="a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.876599 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6"} err="failed to get container status \"a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6\": rpc error: code = NotFound desc = could not find container \"a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6\": container with ID starting with a3369d1c74156b6a656d78295e2398fad70adcd1004cc73a1f028d65580e8cf6 not found: ID does not exist" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.876625 4941 scope.go:117] "RemoveContainer" containerID="ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d" Mar 07 07:28:47 crc kubenswrapper[4941]: E0307 07:28:47.877672 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d\": container with ID starting with ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d not found: ID does not exist" containerID="ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.877716 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d"} err="failed to get container status \"ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d\": rpc error: code = NotFound desc = could not find container \"ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d\": container with ID 
starting with ab2d7d577be6978c01dfb96456799f93bf0684fff3bc16698583bec1e6daea8d not found: ID does not exist" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.877736 4941 scope.go:117] "RemoveContainer" containerID="c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4" Mar 07 07:28:47 crc kubenswrapper[4941]: E0307 07:28:47.878188 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4\": container with ID starting with c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4 not found: ID does not exist" containerID="c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.878215 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4"} err="failed to get container status \"c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4\": rpc error: code = NotFound desc = could not find container \"c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4\": container with ID starting with c6e3da26fc525facd023dd3f84a351cf3733535247b3de9214ba58c415fd97b4 not found: ID does not exist" Mar 07 07:28:47 crc kubenswrapper[4941]: I0307 07:28:47.966094 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" path="/var/lib/kubelet/pods/d64bc0e3-30f7-4bcf-8348-b691bfed586b/volumes" Mar 07 07:29:10 crc kubenswrapper[4941]: I0307 07:29:10.315271 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:29:10 crc kubenswrapper[4941]: I0307 
07:29:10.316173 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:29:40 crc kubenswrapper[4941]: I0307 07:29:40.314345 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:29:40 crc kubenswrapper[4941]: I0307 07:29:40.314966 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:29:40 crc kubenswrapper[4941]: I0307 07:29:40.315013 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:29:40 crc kubenswrapper[4941]: I0307 07:29:40.315618 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:29:40 crc kubenswrapper[4941]: I0307 07:29:40.315669 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" 
containerName="machine-config-daemon" containerID="cri-o://98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" gracePeriod=600 Mar 07 07:29:40 crc kubenswrapper[4941]: E0307 07:29:40.448681 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:29:41 crc kubenswrapper[4941]: I0307 07:29:41.290276 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" exitCode=0 Mar 07 07:29:41 crc kubenswrapper[4941]: I0307 07:29:41.290540 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85"} Mar 07 07:29:41 crc kubenswrapper[4941]: I0307 07:29:41.290633 4941 scope.go:117] "RemoveContainer" containerID="7a88073c6303a571ed602b8eda7126dc414ea6424a5b0973cb131c83c8213e24" Mar 07 07:29:41 crc kubenswrapper[4941]: I0307 07:29:41.291251 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:29:41 crc kubenswrapper[4941]: E0307 07:29:41.291550 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:29:56 crc kubenswrapper[4941]: I0307 07:29:56.955275 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:29:56 crc kubenswrapper[4941]: E0307 07:29:56.956292 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.159106 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547810-v2bkm"] Mar 07 07:30:00 crc kubenswrapper[4941]: E0307 07:30:00.159672 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerName="registry-server" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.159683 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerName="registry-server" Mar 07 07:30:00 crc kubenswrapper[4941]: E0307 07:30:00.159699 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerName="extract-content" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.159704 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerName="extract-content" Mar 07 07:30:00 crc kubenswrapper[4941]: E0307 07:30:00.159720 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerName="extract-utilities" Mar 07 07:30:00 crc kubenswrapper[4941]: 
I0307 07:30:00.159727 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerName="extract-utilities" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.159892 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64bc0e3-30f7-4bcf-8348-b691bfed586b" containerName="registry-server" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.160370 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547810-v2bkm" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.162805 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.164132 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.166943 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.183248 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547810-v2bkm"] Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.198756 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdv9f\" (UniqueName: \"kubernetes.io/projected/27492f66-480f-47d3-bd80-f82120e0b598-kube-api-access-xdv9f\") pod \"auto-csr-approver-29547810-v2bkm\" (UID: \"27492f66-480f-47d3-bd80-f82120e0b598\") " pod="openshift-infra/auto-csr-approver-29547810-v2bkm" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.254557 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql"] Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.255376 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.257218 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.259589 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.263274 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql"] Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.299509 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmtnx\" (UniqueName: \"kubernetes.io/projected/d66ae703-5445-464e-9c39-533e57a6b3f3-kube-api-access-rmtnx\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.299552 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66ae703-5445-464e-9c39-533e57a6b3f3-config-volume\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.299583 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdv9f\" (UniqueName: \"kubernetes.io/projected/27492f66-480f-47d3-bd80-f82120e0b598-kube-api-access-xdv9f\") pod \"auto-csr-approver-29547810-v2bkm\" (UID: \"27492f66-480f-47d3-bd80-f82120e0b598\") " 
pod="openshift-infra/auto-csr-approver-29547810-v2bkm" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.299606 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66ae703-5445-464e-9c39-533e57a6b3f3-secret-volume\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.317067 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdv9f\" (UniqueName: \"kubernetes.io/projected/27492f66-480f-47d3-bd80-f82120e0b598-kube-api-access-xdv9f\") pod \"auto-csr-approver-29547810-v2bkm\" (UID: \"27492f66-480f-47d3-bd80-f82120e0b598\") " pod="openshift-infra/auto-csr-approver-29547810-v2bkm" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.400677 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmtnx\" (UniqueName: \"kubernetes.io/projected/d66ae703-5445-464e-9c39-533e57a6b3f3-kube-api-access-rmtnx\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.400759 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66ae703-5445-464e-9c39-533e57a6b3f3-config-volume\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.400824 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d66ae703-5445-464e-9c39-533e57a6b3f3-secret-volume\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.402149 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66ae703-5445-464e-9c39-533e57a6b3f3-config-volume\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.406067 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66ae703-5445-464e-9c39-533e57a6b3f3-secret-volume\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.416790 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmtnx\" (UniqueName: \"kubernetes.io/projected/d66ae703-5445-464e-9c39-533e57a6b3f3-kube-api-access-rmtnx\") pod \"collect-profiles-29547810-pfdql\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.490911 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547810-v2bkm" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.568894 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.899928 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547810-v2bkm"] Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.911583 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:30:00 crc kubenswrapper[4941]: I0307 07:30:00.984526 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql"] Mar 07 07:30:00 crc kubenswrapper[4941]: W0307 07:30:00.988089 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd66ae703_5445_464e_9c39_533e57a6b3f3.slice/crio-0be7c7926cf71b2179d9bb07f84d2ade1503cecf18d84f79c78954ffb4e9b0dc WatchSource:0}: Error finding container 0be7c7926cf71b2179d9bb07f84d2ade1503cecf18d84f79c78954ffb4e9b0dc: Status 404 returned error can't find the container with id 0be7c7926cf71b2179d9bb07f84d2ade1503cecf18d84f79c78954ffb4e9b0dc Mar 07 07:30:01 crc kubenswrapper[4941]: I0307 07:30:01.447696 4941 generic.go:334] "Generic (PLEG): container finished" podID="d66ae703-5445-464e-9c39-533e57a6b3f3" containerID="f5d45c9b37523e9e553e9ff611880840e9e18d4fc4618b79fd607c95a04f491b" exitCode=0 Mar 07 07:30:01 crc kubenswrapper[4941]: I0307 07:30:01.447786 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" event={"ID":"d66ae703-5445-464e-9c39-533e57a6b3f3","Type":"ContainerDied","Data":"f5d45c9b37523e9e553e9ff611880840e9e18d4fc4618b79fd607c95a04f491b"} Mar 07 07:30:01 crc kubenswrapper[4941]: I0307 07:30:01.448606 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" 
event={"ID":"d66ae703-5445-464e-9c39-533e57a6b3f3","Type":"ContainerStarted","Data":"0be7c7926cf71b2179d9bb07f84d2ade1503cecf18d84f79c78954ffb4e9b0dc"} Mar 07 07:30:01 crc kubenswrapper[4941]: I0307 07:30:01.450975 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547810-v2bkm" event={"ID":"27492f66-480f-47d3-bd80-f82120e0b598","Type":"ContainerStarted","Data":"f8ee7bd84f5c097fcbde28a6b1bc77578b584a62c38bcc513620794daa3fe06e"} Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.802867 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.834834 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66ae703-5445-464e-9c39-533e57a6b3f3-config-volume\") pod \"d66ae703-5445-464e-9c39-533e57a6b3f3\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.834924 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmtnx\" (UniqueName: \"kubernetes.io/projected/d66ae703-5445-464e-9c39-533e57a6b3f3-kube-api-access-rmtnx\") pod \"d66ae703-5445-464e-9c39-533e57a6b3f3\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.834952 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66ae703-5445-464e-9c39-533e57a6b3f3-secret-volume\") pod \"d66ae703-5445-464e-9c39-533e57a6b3f3\" (UID: \"d66ae703-5445-464e-9c39-533e57a6b3f3\") " Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.836451 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66ae703-5445-464e-9c39-533e57a6b3f3-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "d66ae703-5445-464e-9c39-533e57a6b3f3" (UID: "d66ae703-5445-464e-9c39-533e57a6b3f3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.842432 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66ae703-5445-464e-9c39-533e57a6b3f3-kube-api-access-rmtnx" (OuterVolumeSpecName: "kube-api-access-rmtnx") pod "d66ae703-5445-464e-9c39-533e57a6b3f3" (UID: "d66ae703-5445-464e-9c39-533e57a6b3f3"). InnerVolumeSpecName "kube-api-access-rmtnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.843357 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66ae703-5445-464e-9c39-533e57a6b3f3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d66ae703-5445-464e-9c39-533e57a6b3f3" (UID: "d66ae703-5445-464e-9c39-533e57a6b3f3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.937246 4941 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66ae703-5445-464e-9c39-533e57a6b3f3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.937295 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmtnx\" (UniqueName: \"kubernetes.io/projected/d66ae703-5445-464e-9c39-533e57a6b3f3-kube-api-access-rmtnx\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:02 crc kubenswrapper[4941]: I0307 07:30:02.937310 4941 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66ae703-5445-464e-9c39-533e57a6b3f3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:03 crc kubenswrapper[4941]: I0307 07:30:03.480786 4941 generic.go:334] "Generic (PLEG): container finished" podID="27492f66-480f-47d3-bd80-f82120e0b598" containerID="a691d2ea9c8bf72fe742d98bc0737bfd9f314e60ca1cecba695617a6dc3e559b" exitCode=0 Mar 07 07:30:03 crc kubenswrapper[4941]: I0307 07:30:03.480838 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547810-v2bkm" event={"ID":"27492f66-480f-47d3-bd80-f82120e0b598","Type":"ContainerDied","Data":"a691d2ea9c8bf72fe742d98bc0737bfd9f314e60ca1cecba695617a6dc3e559b"} Mar 07 07:30:03 crc kubenswrapper[4941]: I0307 07:30:03.484733 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" event={"ID":"d66ae703-5445-464e-9c39-533e57a6b3f3","Type":"ContainerDied","Data":"0be7c7926cf71b2179d9bb07f84d2ade1503cecf18d84f79c78954ffb4e9b0dc"} Mar 07 07:30:03 crc kubenswrapper[4941]: I0307 07:30:03.484767 4941 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0be7c7926cf71b2179d9bb07f84d2ade1503cecf18d84f79c78954ffb4e9b0dc" Mar 07 07:30:03 crc kubenswrapper[4941]: I0307 07:30:03.484790 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-pfdql" Mar 07 07:30:03 crc kubenswrapper[4941]: I0307 07:30:03.883926 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77"] Mar 07 07:30:03 crc kubenswrapper[4941]: I0307 07:30:03.890518 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547765-2wh77"] Mar 07 07:30:03 crc kubenswrapper[4941]: I0307 07:30:03.964182 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb" path="/var/lib/kubelet/pods/5b1d79bb-eb2e-4985-aad4-cd8a2ed067fb/volumes" Mar 07 07:30:04 crc kubenswrapper[4941]: I0307 07:30:04.835801 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547810-v2bkm" Mar 07 07:30:04 crc kubenswrapper[4941]: I0307 07:30:04.974523 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdv9f\" (UniqueName: \"kubernetes.io/projected/27492f66-480f-47d3-bd80-f82120e0b598-kube-api-access-xdv9f\") pod \"27492f66-480f-47d3-bd80-f82120e0b598\" (UID: \"27492f66-480f-47d3-bd80-f82120e0b598\") " Mar 07 07:30:04 crc kubenswrapper[4941]: I0307 07:30:04.979140 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27492f66-480f-47d3-bd80-f82120e0b598-kube-api-access-xdv9f" (OuterVolumeSpecName: "kube-api-access-xdv9f") pod "27492f66-480f-47d3-bd80-f82120e0b598" (UID: "27492f66-480f-47d3-bd80-f82120e0b598"). InnerVolumeSpecName "kube-api-access-xdv9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:30:05 crc kubenswrapper[4941]: I0307 07:30:05.075687 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdv9f\" (UniqueName: \"kubernetes.io/projected/27492f66-480f-47d3-bd80-f82120e0b598-kube-api-access-xdv9f\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:05 crc kubenswrapper[4941]: I0307 07:30:05.504696 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547810-v2bkm" event={"ID":"27492f66-480f-47d3-bd80-f82120e0b598","Type":"ContainerDied","Data":"f8ee7bd84f5c097fcbde28a6b1bc77578b584a62c38bcc513620794daa3fe06e"} Mar 07 07:30:05 crc kubenswrapper[4941]: I0307 07:30:05.505382 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8ee7bd84f5c097fcbde28a6b1bc77578b584a62c38bcc513620794daa3fe06e" Mar 07 07:30:05 crc kubenswrapper[4941]: I0307 07:30:05.504784 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547810-v2bkm" Mar 07 07:30:05 crc kubenswrapper[4941]: I0307 07:30:05.888201 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547804-hkn5q"] Mar 07 07:30:05 crc kubenswrapper[4941]: I0307 07:30:05.893910 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547804-hkn5q"] Mar 07 07:30:05 crc kubenswrapper[4941]: I0307 07:30:05.963343 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325ab554-4cc1-4530-b655-369b62854330" path="/var/lib/kubelet/pods/325ab554-4cc1-4530-b655-369b62854330/volumes" Mar 07 07:30:08 crc kubenswrapper[4941]: I0307 07:30:08.309660 4941 scope.go:117] "RemoveContainer" containerID="256668fde66ac50d195569767d11e7f22581a583b8cc45738819061a4e3737e2" Mar 07 07:30:08 crc kubenswrapper[4941]: I0307 07:30:08.328442 4941 scope.go:117] "RemoveContainer" 
containerID="83c353ce79f938e4dba9e0d39329ecd3a6eb6a52c9f3988ce15c5788ada3e8d0" Mar 07 07:30:09 crc kubenswrapper[4941]: I0307 07:30:09.955652 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:30:09 crc kubenswrapper[4941]: E0307 07:30:09.956471 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:30:23 crc kubenswrapper[4941]: I0307 07:30:23.964786 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:30:23 crc kubenswrapper[4941]: E0307 07:30:23.966158 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:30:29 crc kubenswrapper[4941]: I0307 07:30:29.880341 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-62fsk"] Mar 07 07:30:29 crc kubenswrapper[4941]: E0307 07:30:29.881743 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27492f66-480f-47d3-bd80-f82120e0b598" containerName="oc" Mar 07 07:30:29 crc kubenswrapper[4941]: I0307 07:30:29.881773 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="27492f66-480f-47d3-bd80-f82120e0b598" containerName="oc" Mar 07 07:30:29 crc 
kubenswrapper[4941]: E0307 07:30:29.881848 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66ae703-5445-464e-9c39-533e57a6b3f3" containerName="collect-profiles" Mar 07 07:30:29 crc kubenswrapper[4941]: I0307 07:30:29.881864 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66ae703-5445-464e-9c39-533e57a6b3f3" containerName="collect-profiles" Mar 07 07:30:29 crc kubenswrapper[4941]: I0307 07:30:29.882186 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="27492f66-480f-47d3-bd80-f82120e0b598" containerName="oc" Mar 07 07:30:29 crc kubenswrapper[4941]: I0307 07:30:29.882220 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66ae703-5445-464e-9c39-533e57a6b3f3" containerName="collect-profiles" Mar 07 07:30:29 crc kubenswrapper[4941]: I0307 07:30:29.885145 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:29 crc kubenswrapper[4941]: I0307 07:30:29.901270 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fsk"] Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.074015 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-catalog-content\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.074074 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-utilities\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 
07:30:30.074211 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pz66\" (UniqueName: \"kubernetes.io/projected/3419e79d-36ce-4713-b66c-631d3c78628f-kube-api-access-9pz66\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.175643 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-catalog-content\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.175706 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-utilities\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.175759 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pz66\" (UniqueName: \"kubernetes.io/projected/3419e79d-36ce-4713-b66c-631d3c78628f-kube-api-access-9pz66\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.176146 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-catalog-content\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 
07:30:30.176270 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-utilities\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.200953 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pz66\" (UniqueName: \"kubernetes.io/projected/3419e79d-36ce-4713-b66c-631d3c78628f-kube-api-access-9pz66\") pod \"redhat-marketplace-62fsk\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.241270 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.674068 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fsk"] Mar 07 07:30:30 crc kubenswrapper[4941]: I0307 07:30:30.724093 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fsk" event={"ID":"3419e79d-36ce-4713-b66c-631d3c78628f","Type":"ContainerStarted","Data":"f7f7bee36d1351ea23bd1f73d54f67b635b8a841a3a467ac0a6c4003fb3e78a3"} Mar 07 07:30:31 crc kubenswrapper[4941]: I0307 07:30:31.740748 4941 generic.go:334] "Generic (PLEG): container finished" podID="3419e79d-36ce-4713-b66c-631d3c78628f" containerID="15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad" exitCode=0 Mar 07 07:30:31 crc kubenswrapper[4941]: I0307 07:30:31.740819 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fsk" event={"ID":"3419e79d-36ce-4713-b66c-631d3c78628f","Type":"ContainerDied","Data":"15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad"} Mar 07 
07:30:33 crc kubenswrapper[4941]: I0307 07:30:33.763373 4941 generic.go:334] "Generic (PLEG): container finished" podID="3419e79d-36ce-4713-b66c-631d3c78628f" containerID="ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5" exitCode=0 Mar 07 07:30:33 crc kubenswrapper[4941]: I0307 07:30:33.763467 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fsk" event={"ID":"3419e79d-36ce-4713-b66c-631d3c78628f","Type":"ContainerDied","Data":"ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5"} Mar 07 07:30:34 crc kubenswrapper[4941]: I0307 07:30:34.775336 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fsk" event={"ID":"3419e79d-36ce-4713-b66c-631d3c78628f","Type":"ContainerStarted","Data":"bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b"} Mar 07 07:30:34 crc kubenswrapper[4941]: I0307 07:30:34.798256 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-62fsk" podStartSLOduration=3.3393484940000002 podStartE2EDuration="5.79823979s" podCreationTimestamp="2026-03-07 07:30:29 +0000 UTC" firstStartedPulling="2026-03-07 07:30:31.743448202 +0000 UTC m=+2328.695813667" lastFinishedPulling="2026-03-07 07:30:34.202339488 +0000 UTC m=+2331.154704963" observedRunningTime="2026-03-07 07:30:34.797447711 +0000 UTC m=+2331.749813196" watchObservedRunningTime="2026-03-07 07:30:34.79823979 +0000 UTC m=+2331.750605245" Mar 07 07:30:38 crc kubenswrapper[4941]: I0307 07:30:38.954136 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:30:38 crc kubenswrapper[4941]: E0307 07:30:38.954701 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:30:40 crc kubenswrapper[4941]: I0307 07:30:40.241688 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:40 crc kubenswrapper[4941]: I0307 07:30:40.242122 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:40 crc kubenswrapper[4941]: I0307 07:30:40.291912 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:40 crc kubenswrapper[4941]: I0307 07:30:40.893607 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:40 crc kubenswrapper[4941]: I0307 07:30:40.950583 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fsk"] Mar 07 07:30:42 crc kubenswrapper[4941]: I0307 07:30:42.840309 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-62fsk" podUID="3419e79d-36ce-4713-b66c-631d3c78628f" containerName="registry-server" containerID="cri-o://bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b" gracePeriod=2 Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.280887 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.477646 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-utilities\") pod \"3419e79d-36ce-4713-b66c-631d3c78628f\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.477762 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pz66\" (UniqueName: \"kubernetes.io/projected/3419e79d-36ce-4713-b66c-631d3c78628f-kube-api-access-9pz66\") pod \"3419e79d-36ce-4713-b66c-631d3c78628f\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.477822 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-catalog-content\") pod \"3419e79d-36ce-4713-b66c-631d3c78628f\" (UID: \"3419e79d-36ce-4713-b66c-631d3c78628f\") " Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.479884 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-utilities" (OuterVolumeSpecName: "utilities") pod "3419e79d-36ce-4713-b66c-631d3c78628f" (UID: "3419e79d-36ce-4713-b66c-631d3c78628f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.487900 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3419e79d-36ce-4713-b66c-631d3c78628f-kube-api-access-9pz66" (OuterVolumeSpecName: "kube-api-access-9pz66") pod "3419e79d-36ce-4713-b66c-631d3c78628f" (UID: "3419e79d-36ce-4713-b66c-631d3c78628f"). InnerVolumeSpecName "kube-api-access-9pz66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.529350 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3419e79d-36ce-4713-b66c-631d3c78628f" (UID: "3419e79d-36ce-4713-b66c-631d3c78628f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.579857 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pz66\" (UniqueName: \"kubernetes.io/projected/3419e79d-36ce-4713-b66c-631d3c78628f-kube-api-access-9pz66\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.579904 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.579920 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419e79d-36ce-4713-b66c-631d3c78628f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.852811 4941 generic.go:334] "Generic (PLEG): container finished" podID="3419e79d-36ce-4713-b66c-631d3c78628f" containerID="bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b" exitCode=0 Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.852875 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fsk" event={"ID":"3419e79d-36ce-4713-b66c-631d3c78628f","Type":"ContainerDied","Data":"bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b"} Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.852913 4941 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-62fsk" event={"ID":"3419e79d-36ce-4713-b66c-631d3c78628f","Type":"ContainerDied","Data":"f7f7bee36d1351ea23bd1f73d54f67b635b8a841a3a467ac0a6c4003fb3e78a3"} Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.852940 4941 scope.go:117] "RemoveContainer" containerID="bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.852946 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62fsk" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.887201 4941 scope.go:117] "RemoveContainer" containerID="ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.922141 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fsk"] Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.931201 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fsk"] Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.940359 4941 scope.go:117] "RemoveContainer" containerID="15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.962281 4941 scope.go:117] "RemoveContainer" containerID="bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b" Mar 07 07:30:43 crc kubenswrapper[4941]: E0307 07:30:43.962921 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b\": container with ID starting with bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b not found: ID does not exist" containerID="bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.963002 4941 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b"} err="failed to get container status \"bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b\": rpc error: code = NotFound desc = could not find container \"bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b\": container with ID starting with bdf50c80d3cd06b7143f9cca5885f921904b344c50dd68652fdd93e5db42a96b not found: ID does not exist" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.963065 4941 scope.go:117] "RemoveContainer" containerID="ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5" Mar 07 07:30:43 crc kubenswrapper[4941]: E0307 07:30:43.963508 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5\": container with ID starting with ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5 not found: ID does not exist" containerID="ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.963546 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5"} err="failed to get container status \"ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5\": rpc error: code = NotFound desc = could not find container \"ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5\": container with ID starting with ac4711516846cb53a94fa1524b8bc7447a1b6893f26df1dc59ccd4e604f025a5 not found: ID does not exist" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.963593 4941 scope.go:117] "RemoveContainer" containerID="15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad" Mar 07 07:30:43 crc kubenswrapper[4941]: E0307 
07:30:43.963928 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad\": container with ID starting with 15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad not found: ID does not exist" containerID="15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.963959 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad"} err="failed to get container status \"15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad\": rpc error: code = NotFound desc = could not find container \"15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad\": container with ID starting with 15f264fc23e943c65ef28808a47e7ecf09f61c2e32b5b3a0c5c216e9dc7875ad not found: ID does not exist" Mar 07 07:30:43 crc kubenswrapper[4941]: I0307 07:30:43.968397 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3419e79d-36ce-4713-b66c-631d3c78628f" path="/var/lib/kubelet/pods/3419e79d-36ce-4713-b66c-631d3c78628f/volumes" Mar 07 07:30:50 crc kubenswrapper[4941]: I0307 07:30:50.955021 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:30:50 crc kubenswrapper[4941]: E0307 07:30:50.955945 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:31:04 crc kubenswrapper[4941]: I0307 07:31:04.954910 
4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:31:04 crc kubenswrapper[4941]: E0307 07:31:04.956096 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:31:18 crc kubenswrapper[4941]: I0307 07:31:18.955109 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:31:18 crc kubenswrapper[4941]: E0307 07:31:18.956202 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:31:30 crc kubenswrapper[4941]: I0307 07:31:30.966837 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:31:30 crc kubenswrapper[4941]: E0307 07:31:30.967821 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:31:45 crc kubenswrapper[4941]: I0307 
07:31:45.956547 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:31:45 crc kubenswrapper[4941]: E0307 07:31:45.958760 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:31:56 crc kubenswrapper[4941]: I0307 07:31:56.954991 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:31:56 crc kubenswrapper[4941]: E0307 07:31:56.955715 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.164537 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547812-qrf7b"] Mar 07 07:32:00 crc kubenswrapper[4941]: E0307 07:32:00.165182 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419e79d-36ce-4713-b66c-631d3c78628f" containerName="registry-server" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.165196 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419e79d-36ce-4713-b66c-631d3c78628f" containerName="registry-server" Mar 07 07:32:00 crc kubenswrapper[4941]: E0307 07:32:00.165213 4941 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3419e79d-36ce-4713-b66c-631d3c78628f" containerName="extract-utilities" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.165222 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419e79d-36ce-4713-b66c-631d3c78628f" containerName="extract-utilities" Mar 07 07:32:00 crc kubenswrapper[4941]: E0307 07:32:00.165251 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419e79d-36ce-4713-b66c-631d3c78628f" containerName="extract-content" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.165260 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419e79d-36ce-4713-b66c-631d3c78628f" containerName="extract-content" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.165445 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="3419e79d-36ce-4713-b66c-631d3c78628f" containerName="registry-server" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.166032 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.169765 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.170457 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.171652 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.182271 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547812-qrf7b"] Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.342395 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj2kv\" (UniqueName: 
\"kubernetes.io/projected/d39bd1fe-6d59-4d98-96a6-8e80db39b490-kube-api-access-zj2kv\") pod \"auto-csr-approver-29547812-qrf7b\" (UID: \"d39bd1fe-6d59-4d98-96a6-8e80db39b490\") " pod="openshift-infra/auto-csr-approver-29547812-qrf7b" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.444257 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj2kv\" (UniqueName: \"kubernetes.io/projected/d39bd1fe-6d59-4d98-96a6-8e80db39b490-kube-api-access-zj2kv\") pod \"auto-csr-approver-29547812-qrf7b\" (UID: \"d39bd1fe-6d59-4d98-96a6-8e80db39b490\") " pod="openshift-infra/auto-csr-approver-29547812-qrf7b" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.473077 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj2kv\" (UniqueName: \"kubernetes.io/projected/d39bd1fe-6d59-4d98-96a6-8e80db39b490-kube-api-access-zj2kv\") pod \"auto-csr-approver-29547812-qrf7b\" (UID: \"d39bd1fe-6d59-4d98-96a6-8e80db39b490\") " pod="openshift-infra/auto-csr-approver-29547812-qrf7b" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.487762 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" Mar 07 07:32:00 crc kubenswrapper[4941]: I0307 07:32:00.933141 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547812-qrf7b"] Mar 07 07:32:00 crc kubenswrapper[4941]: W0307 07:32:00.947977 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd39bd1fe_6d59_4d98_96a6_8e80db39b490.slice/crio-1ada4f0045888dcd4489df49ebcc60058f6ebd9fa8e42219f36b106296e09a77 WatchSource:0}: Error finding container 1ada4f0045888dcd4489df49ebcc60058f6ebd9fa8e42219f36b106296e09a77: Status 404 returned error can't find the container with id 1ada4f0045888dcd4489df49ebcc60058f6ebd9fa8e42219f36b106296e09a77 Mar 07 07:32:01 crc kubenswrapper[4941]: I0307 07:32:01.494742 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" event={"ID":"d39bd1fe-6d59-4d98-96a6-8e80db39b490","Type":"ContainerStarted","Data":"1ada4f0045888dcd4489df49ebcc60058f6ebd9fa8e42219f36b106296e09a77"} Mar 07 07:32:03 crc kubenswrapper[4941]: I0307 07:32:03.511782 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" event={"ID":"d39bd1fe-6d59-4d98-96a6-8e80db39b490","Type":"ContainerStarted","Data":"54c4abff890da20264731abb5419cee796457fe5256455e4855bb09e0728050c"} Mar 07 07:32:03 crc kubenswrapper[4941]: I0307 07:32:03.529958 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" podStartSLOduration=1.66404935 podStartE2EDuration="3.529940027s" podCreationTimestamp="2026-03-07 07:32:00 +0000 UTC" firstStartedPulling="2026-03-07 07:32:00.951329215 +0000 UTC m=+2417.903694690" lastFinishedPulling="2026-03-07 07:32:02.817219882 +0000 UTC m=+2419.769585367" observedRunningTime="2026-03-07 07:32:03.524867732 +0000 UTC m=+2420.477233207" 
watchObservedRunningTime="2026-03-07 07:32:03.529940027 +0000 UTC m=+2420.482305502" Mar 07 07:32:04 crc kubenswrapper[4941]: I0307 07:32:04.523182 4941 generic.go:334] "Generic (PLEG): container finished" podID="d39bd1fe-6d59-4d98-96a6-8e80db39b490" containerID="54c4abff890da20264731abb5419cee796457fe5256455e4855bb09e0728050c" exitCode=0 Mar 07 07:32:04 crc kubenswrapper[4941]: I0307 07:32:04.523381 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" event={"ID":"d39bd1fe-6d59-4d98-96a6-8e80db39b490","Type":"ContainerDied","Data":"54c4abff890da20264731abb5419cee796457fe5256455e4855bb09e0728050c"} Mar 07 07:32:05 crc kubenswrapper[4941]: I0307 07:32:05.810837 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" Mar 07 07:32:05 crc kubenswrapper[4941]: I0307 07:32:05.936947 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj2kv\" (UniqueName: \"kubernetes.io/projected/d39bd1fe-6d59-4d98-96a6-8e80db39b490-kube-api-access-zj2kv\") pod \"d39bd1fe-6d59-4d98-96a6-8e80db39b490\" (UID: \"d39bd1fe-6d59-4d98-96a6-8e80db39b490\") " Mar 07 07:32:05 crc kubenswrapper[4941]: I0307 07:32:05.942503 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39bd1fe-6d59-4d98-96a6-8e80db39b490-kube-api-access-zj2kv" (OuterVolumeSpecName: "kube-api-access-zj2kv") pod "d39bd1fe-6d59-4d98-96a6-8e80db39b490" (UID: "d39bd1fe-6d59-4d98-96a6-8e80db39b490"). InnerVolumeSpecName "kube-api-access-zj2kv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:06 crc kubenswrapper[4941]: I0307 07:32:06.038721 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj2kv\" (UniqueName: \"kubernetes.io/projected/d39bd1fe-6d59-4d98-96a6-8e80db39b490-kube-api-access-zj2kv\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:06 crc kubenswrapper[4941]: I0307 07:32:06.538850 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" event={"ID":"d39bd1fe-6d59-4d98-96a6-8e80db39b490","Type":"ContainerDied","Data":"1ada4f0045888dcd4489df49ebcc60058f6ebd9fa8e42219f36b106296e09a77"} Mar 07 07:32:06 crc kubenswrapper[4941]: I0307 07:32:06.538894 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ada4f0045888dcd4489df49ebcc60058f6ebd9fa8e42219f36b106296e09a77" Mar 07 07:32:06 crc kubenswrapper[4941]: I0307 07:32:06.539004 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547812-qrf7b" Mar 07 07:32:06 crc kubenswrapper[4941]: I0307 07:32:06.617538 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547806-2ktwf"] Mar 07 07:32:06 crc kubenswrapper[4941]: I0307 07:32:06.625452 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547806-2ktwf"] Mar 07 07:32:07 crc kubenswrapper[4941]: I0307 07:32:07.959766 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:32:07 crc kubenswrapper[4941]: E0307 07:32:07.960190 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:32:07 crc kubenswrapper[4941]: I0307 07:32:07.966763 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e971412-547e-4b69-b4e7-6b3f3081eb92" path="/var/lib/kubelet/pods/1e971412-547e-4b69-b4e7-6b3f3081eb92/volumes" Mar 07 07:32:08 crc kubenswrapper[4941]: I0307 07:32:08.439024 4941 scope.go:117] "RemoveContainer" containerID="46e9934c7c5e526698254db7a094fad5839752773304a0c53226f0e2a3986e83" Mar 07 07:32:18 crc kubenswrapper[4941]: I0307 07:32:18.954845 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:32:18 crc kubenswrapper[4941]: E0307 07:32:18.955828 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:32:33 crc kubenswrapper[4941]: I0307 07:32:33.965686 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:32:33 crc kubenswrapper[4941]: E0307 07:32:33.967871 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:32:45 crc kubenswrapper[4941]: I0307 07:32:45.954932 4941 scope.go:117] "RemoveContainer" 
containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:32:45 crc kubenswrapper[4941]: E0307 07:32:45.956064 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:32:58 crc kubenswrapper[4941]: I0307 07:32:58.956076 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:32:58 crc kubenswrapper[4941]: E0307 07:32:58.957368 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:33:10 crc kubenswrapper[4941]: I0307 07:33:10.954852 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:33:10 crc kubenswrapper[4941]: E0307 07:33:10.955626 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:33:22 crc kubenswrapper[4941]: I0307 07:33:22.955167 4941 scope.go:117] 
"RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:33:22 crc kubenswrapper[4941]: E0307 07:33:22.956178 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:33:37 crc kubenswrapper[4941]: I0307 07:33:37.955007 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:33:37 crc kubenswrapper[4941]: E0307 07:33:37.956120 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:33:48 crc kubenswrapper[4941]: I0307 07:33:48.955117 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:33:48 crc kubenswrapper[4941]: E0307 07:33:48.956062 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.154778 
4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547814-22gtq"] Mar 07 07:34:00 crc kubenswrapper[4941]: E0307 07:34:00.155916 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39bd1fe-6d59-4d98-96a6-8e80db39b490" containerName="oc" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.155944 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39bd1fe-6d59-4d98-96a6-8e80db39b490" containerName="oc" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.156264 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39bd1fe-6d59-4d98-96a6-8e80db39b490" containerName="oc" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.157264 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547814-22gtq" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.159631 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.160004 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.160420 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.168162 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547814-22gtq"] Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.280969 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brgkv\" (UniqueName: \"kubernetes.io/projected/541b3f6c-3eac-496e-a9cc-09e1834da93d-kube-api-access-brgkv\") pod \"auto-csr-approver-29547814-22gtq\" (UID: \"541b3f6c-3eac-496e-a9cc-09e1834da93d\") " 
pod="openshift-infra/auto-csr-approver-29547814-22gtq" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.384681 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brgkv\" (UniqueName: \"kubernetes.io/projected/541b3f6c-3eac-496e-a9cc-09e1834da93d-kube-api-access-brgkv\") pod \"auto-csr-approver-29547814-22gtq\" (UID: \"541b3f6c-3eac-496e-a9cc-09e1834da93d\") " pod="openshift-infra/auto-csr-approver-29547814-22gtq" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.419966 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brgkv\" (UniqueName: \"kubernetes.io/projected/541b3f6c-3eac-496e-a9cc-09e1834da93d-kube-api-access-brgkv\") pod \"auto-csr-approver-29547814-22gtq\" (UID: \"541b3f6c-3eac-496e-a9cc-09e1834da93d\") " pod="openshift-infra/auto-csr-approver-29547814-22gtq" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.480587 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547814-22gtq" Mar 07 07:34:00 crc kubenswrapper[4941]: I0307 07:34:00.964088 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547814-22gtq"] Mar 07 07:34:01 crc kubenswrapper[4941]: I0307 07:34:01.530265 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547814-22gtq" event={"ID":"541b3f6c-3eac-496e-a9cc-09e1834da93d","Type":"ContainerStarted","Data":"78914eee6609195a5a52987d904477c81fb2db25c8fadc3367d77f8149543d84"} Mar 07 07:34:01 crc kubenswrapper[4941]: I0307 07:34:01.954996 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:34:01 crc kubenswrapper[4941]: E0307 07:34:01.955608 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:34:04 crc kubenswrapper[4941]: I0307 07:34:04.605732 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547814-22gtq" event={"ID":"541b3f6c-3eac-496e-a9cc-09e1834da93d","Type":"ContainerStarted","Data":"cfaa09cf2964886758e522d378a75f62cf371319b31ba721cfe6960223f0ed3a"} Mar 07 07:34:04 crc kubenswrapper[4941]: I0307 07:34:04.624617 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547814-22gtq" podStartSLOduration=1.6251511760000001 podStartE2EDuration="4.62459829s" podCreationTimestamp="2026-03-07 07:34:00 +0000 UTC" firstStartedPulling="2026-03-07 07:34:00.972697925 +0000 UTC m=+2537.925063400" lastFinishedPulling="2026-03-07 07:34:03.972145029 +0000 UTC m=+2540.924510514" observedRunningTime="2026-03-07 07:34:04.621645718 +0000 UTC m=+2541.574011173" watchObservedRunningTime="2026-03-07 07:34:04.62459829 +0000 UTC m=+2541.576963755" Mar 07 07:34:05 crc kubenswrapper[4941]: I0307 07:34:05.626229 4941 generic.go:334] "Generic (PLEG): container finished" podID="541b3f6c-3eac-496e-a9cc-09e1834da93d" containerID="cfaa09cf2964886758e522d378a75f62cf371319b31ba721cfe6960223f0ed3a" exitCode=0 Mar 07 07:34:05 crc kubenswrapper[4941]: I0307 07:34:05.626434 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547814-22gtq" event={"ID":"541b3f6c-3eac-496e-a9cc-09e1834da93d","Type":"ContainerDied","Data":"cfaa09cf2964886758e522d378a75f62cf371319b31ba721cfe6960223f0ed3a"} Mar 07 07:34:06 crc kubenswrapper[4941]: I0307 07:34:06.931364 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547814-22gtq" Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.039225 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547808-br5ss"] Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.043718 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547808-br5ss"] Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.128293 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brgkv\" (UniqueName: \"kubernetes.io/projected/541b3f6c-3eac-496e-a9cc-09e1834da93d-kube-api-access-brgkv\") pod \"541b3f6c-3eac-496e-a9cc-09e1834da93d\" (UID: \"541b3f6c-3eac-496e-a9cc-09e1834da93d\") " Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.132877 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541b3f6c-3eac-496e-a9cc-09e1834da93d-kube-api-access-brgkv" (OuterVolumeSpecName: "kube-api-access-brgkv") pod "541b3f6c-3eac-496e-a9cc-09e1834da93d" (UID: "541b3f6c-3eac-496e-a9cc-09e1834da93d"). InnerVolumeSpecName "kube-api-access-brgkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.230343 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brgkv\" (UniqueName: \"kubernetes.io/projected/541b3f6c-3eac-496e-a9cc-09e1834da93d-kube-api-access-brgkv\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.644317 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547814-22gtq" event={"ID":"541b3f6c-3eac-496e-a9cc-09e1834da93d","Type":"ContainerDied","Data":"78914eee6609195a5a52987d904477c81fb2db25c8fadc3367d77f8149543d84"} Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.644361 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78914eee6609195a5a52987d904477c81fb2db25c8fadc3367d77f8149543d84" Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.644436 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547814-22gtq" Mar 07 07:34:07 crc kubenswrapper[4941]: I0307 07:34:07.968746 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d6549ed-e6d7-409f-9a6d-995abd2a46a4" path="/var/lib/kubelet/pods/0d6549ed-e6d7-409f-9a6d-995abd2a46a4/volumes" Mar 07 07:34:08 crc kubenswrapper[4941]: I0307 07:34:08.526479 4941 scope.go:117] "RemoveContainer" containerID="893dfb60219c187fb5841ab7f2877c6e849082f254afe142c0b4675c00c8f782" Mar 07 07:34:14 crc kubenswrapper[4941]: I0307 07:34:14.954126 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:34:14 crc kubenswrapper[4941]: E0307 07:34:14.954567 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:34:26 crc kubenswrapper[4941]: I0307 07:34:26.954828 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:34:26 crc kubenswrapper[4941]: E0307 07:34:26.955623 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:34:39 crc kubenswrapper[4941]: I0307 07:34:39.955112 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:34:39 crc kubenswrapper[4941]: E0307 07:34:39.956079 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:34:54 crc kubenswrapper[4941]: I0307 07:34:54.954909 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:34:56 crc kubenswrapper[4941]: I0307 07:34:56.051932 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" 
event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"88ca66b5a032f60fea483bfc96920dae763c35b858a0aa50b1a11df5e9201ea8"} Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.170872 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547816-wbc9b"] Mar 07 07:36:00 crc kubenswrapper[4941]: E0307 07:36:00.172698 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541b3f6c-3eac-496e-a9cc-09e1834da93d" containerName="oc" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.172736 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="541b3f6c-3eac-496e-a9cc-09e1834da93d" containerName="oc" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.173118 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="541b3f6c-3eac-496e-a9cc-09e1834da93d" containerName="oc" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.174167 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.176267 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.176271 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.176490 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.186816 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547816-wbc9b"] Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.276787 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4bgg\" (UniqueName: 
\"kubernetes.io/projected/29b8a10c-855b-46f4-844e-8e9fbb15ad39-kube-api-access-j4bgg\") pod \"auto-csr-approver-29547816-wbc9b\" (UID: \"29b8a10c-855b-46f4-844e-8e9fbb15ad39\") " pod="openshift-infra/auto-csr-approver-29547816-wbc9b" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.377771 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4bgg\" (UniqueName: \"kubernetes.io/projected/29b8a10c-855b-46f4-844e-8e9fbb15ad39-kube-api-access-j4bgg\") pod \"auto-csr-approver-29547816-wbc9b\" (UID: \"29b8a10c-855b-46f4-844e-8e9fbb15ad39\") " pod="openshift-infra/auto-csr-approver-29547816-wbc9b" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.402485 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4bgg\" (UniqueName: \"kubernetes.io/projected/29b8a10c-855b-46f4-844e-8e9fbb15ad39-kube-api-access-j4bgg\") pod \"auto-csr-approver-29547816-wbc9b\" (UID: \"29b8a10c-855b-46f4-844e-8e9fbb15ad39\") " pod="openshift-infra/auto-csr-approver-29547816-wbc9b" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.509129 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.943376 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547816-wbc9b"] Mar 07 07:36:00 crc kubenswrapper[4941]: I0307 07:36:00.956454 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:36:01 crc kubenswrapper[4941]: I0307 07:36:01.606797 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" event={"ID":"29b8a10c-855b-46f4-844e-8e9fbb15ad39","Type":"ContainerStarted","Data":"02f606fb7ee5a4ecc59ecea46efcb1834c4709a8374155a2c77635949e5f304f"} Mar 07 07:36:02 crc kubenswrapper[4941]: I0307 07:36:02.616707 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" event={"ID":"29b8a10c-855b-46f4-844e-8e9fbb15ad39","Type":"ContainerStarted","Data":"16d3a4e7365a953536360e02b7a6f43b5b8ca5e991327c85c29675898aa5340e"} Mar 07 07:36:02 crc kubenswrapper[4941]: I0307 07:36:02.633699 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" podStartSLOduration=1.617717956 podStartE2EDuration="2.633680149s" podCreationTimestamp="2026-03-07 07:36:00 +0000 UTC" firstStartedPulling="2026-03-07 07:36:00.956186043 +0000 UTC m=+2657.908551498" lastFinishedPulling="2026-03-07 07:36:01.972148226 +0000 UTC m=+2658.924513691" observedRunningTime="2026-03-07 07:36:02.630078131 +0000 UTC m=+2659.582443596" watchObservedRunningTime="2026-03-07 07:36:02.633680149 +0000 UTC m=+2659.586045614" Mar 07 07:36:03 crc kubenswrapper[4941]: I0307 07:36:03.624165 4941 generic.go:334] "Generic (PLEG): container finished" podID="29b8a10c-855b-46f4-844e-8e9fbb15ad39" containerID="16d3a4e7365a953536360e02b7a6f43b5b8ca5e991327c85c29675898aa5340e" exitCode=0 Mar 07 07:36:03 crc 
kubenswrapper[4941]: I0307 07:36:03.624208 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" event={"ID":"29b8a10c-855b-46f4-844e-8e9fbb15ad39","Type":"ContainerDied","Data":"16d3a4e7365a953536360e02b7a6f43b5b8ca5e991327c85c29675898aa5340e"} Mar 07 07:36:04 crc kubenswrapper[4941]: I0307 07:36:04.916858 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" Mar 07 07:36:04 crc kubenswrapper[4941]: I0307 07:36:04.955531 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4bgg\" (UniqueName: \"kubernetes.io/projected/29b8a10c-855b-46f4-844e-8e9fbb15ad39-kube-api-access-j4bgg\") pod \"29b8a10c-855b-46f4-844e-8e9fbb15ad39\" (UID: \"29b8a10c-855b-46f4-844e-8e9fbb15ad39\") " Mar 07 07:36:04 crc kubenswrapper[4941]: I0307 07:36:04.962463 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b8a10c-855b-46f4-844e-8e9fbb15ad39-kube-api-access-j4bgg" (OuterVolumeSpecName: "kube-api-access-j4bgg") pod "29b8a10c-855b-46f4-844e-8e9fbb15ad39" (UID: "29b8a10c-855b-46f4-844e-8e9fbb15ad39"). InnerVolumeSpecName "kube-api-access-j4bgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:05 crc kubenswrapper[4941]: I0307 07:36:05.056534 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4bgg\" (UniqueName: \"kubernetes.io/projected/29b8a10c-855b-46f4-844e-8e9fbb15ad39-kube-api-access-j4bgg\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:05 crc kubenswrapper[4941]: I0307 07:36:05.649246 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" event={"ID":"29b8a10c-855b-46f4-844e-8e9fbb15ad39","Type":"ContainerDied","Data":"02f606fb7ee5a4ecc59ecea46efcb1834c4709a8374155a2c77635949e5f304f"} Mar 07 07:36:05 crc kubenswrapper[4941]: I0307 07:36:05.649301 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02f606fb7ee5a4ecc59ecea46efcb1834c4709a8374155a2c77635949e5f304f" Mar 07 07:36:05 crc kubenswrapper[4941]: I0307 07:36:05.649365 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547816-wbc9b" Mar 07 07:36:05 crc kubenswrapper[4941]: I0307 07:36:05.714712 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547810-v2bkm"] Mar 07 07:36:05 crc kubenswrapper[4941]: I0307 07:36:05.721005 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547810-v2bkm"] Mar 07 07:36:05 crc kubenswrapper[4941]: I0307 07:36:05.968614 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27492f66-480f-47d3-bd80-f82120e0b598" path="/var/lib/kubelet/pods/27492f66-480f-47d3-bd80-f82120e0b598/volumes" Mar 07 07:36:08 crc kubenswrapper[4941]: I0307 07:36:08.607181 4941 scope.go:117] "RemoveContainer" containerID="a691d2ea9c8bf72fe742d98bc0737bfd9f314e60ca1cecba695617a6dc3e559b" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.183199 4941 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-4vnct"] Mar 07 07:36:21 crc kubenswrapper[4941]: E0307 07:36:21.184038 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b8a10c-855b-46f4-844e-8e9fbb15ad39" containerName="oc" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.184054 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b8a10c-855b-46f4-844e-8e9fbb15ad39" containerName="oc" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.184229 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b8a10c-855b-46f4-844e-8e9fbb15ad39" containerName="oc" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.185270 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.192333 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vnct"] Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.267923 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-utilities\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.268194 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-catalog-content\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.268322 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xpk7\" 
(UniqueName: \"kubernetes.io/projected/bafc086d-3932-4f4e-95e1-02fbef009d6a-kube-api-access-4xpk7\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.369394 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-utilities\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.369483 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-catalog-content\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.369508 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xpk7\" (UniqueName: \"kubernetes.io/projected/bafc086d-3932-4f4e-95e1-02fbef009d6a-kube-api-access-4xpk7\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.370571 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-catalog-content\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.370630 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-utilities\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.389702 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xpk7\" (UniqueName: \"kubernetes.io/projected/bafc086d-3932-4f4e-95e1-02fbef009d6a-kube-api-access-4xpk7\") pod \"certified-operators-4vnct\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:21 crc kubenswrapper[4941]: I0307 07:36:21.554533 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:22 crc kubenswrapper[4941]: I0307 07:36:22.051502 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vnct"] Mar 07 07:36:22 crc kubenswrapper[4941]: I0307 07:36:22.824448 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vnct" event={"ID":"bafc086d-3932-4f4e-95e1-02fbef009d6a","Type":"ContainerStarted","Data":"2018965fffb55832739d6497e53d560d06a3cb72e0bf241a40a13e13dcd05040"} Mar 07 07:36:23 crc kubenswrapper[4941]: I0307 07:36:23.835969 4941 generic.go:334] "Generic (PLEG): container finished" podID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerID="478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c" exitCode=0 Mar 07 07:36:23 crc kubenswrapper[4941]: I0307 07:36:23.836053 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vnct" event={"ID":"bafc086d-3932-4f4e-95e1-02fbef009d6a","Type":"ContainerDied","Data":"478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c"} Mar 07 07:36:26 crc kubenswrapper[4941]: I0307 07:36:26.875169 4941 generic.go:334] "Generic (PLEG): container 
finished" podID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerID="73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97" exitCode=0 Mar 07 07:36:26 crc kubenswrapper[4941]: I0307 07:36:26.875259 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vnct" event={"ID":"bafc086d-3932-4f4e-95e1-02fbef009d6a","Type":"ContainerDied","Data":"73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97"} Mar 07 07:36:28 crc kubenswrapper[4941]: I0307 07:36:28.900702 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vnct" event={"ID":"bafc086d-3932-4f4e-95e1-02fbef009d6a","Type":"ContainerStarted","Data":"c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3"} Mar 07 07:36:31 crc kubenswrapper[4941]: I0307 07:36:31.554651 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:31 crc kubenswrapper[4941]: I0307 07:36:31.554967 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:31 crc kubenswrapper[4941]: I0307 07:36:31.630435 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:31 crc kubenswrapper[4941]: I0307 07:36:31.661996 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4vnct" podStartSLOduration=6.320189614 podStartE2EDuration="10.661979134s" podCreationTimestamp="2026-03-07 07:36:21 +0000 UTC" firstStartedPulling="2026-03-07 07:36:23.837938033 +0000 UTC m=+2680.790303508" lastFinishedPulling="2026-03-07 07:36:28.179727523 +0000 UTC m=+2685.132093028" observedRunningTime="2026-03-07 07:36:28.924853908 +0000 UTC m=+2685.877219383" watchObservedRunningTime="2026-03-07 07:36:31.661979134 +0000 UTC m=+2688.614344599" Mar 
07 07:36:41 crc kubenswrapper[4941]: I0307 07:36:41.626727 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:41 crc kubenswrapper[4941]: I0307 07:36:41.698597 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4vnct"] Mar 07 07:36:42 crc kubenswrapper[4941]: I0307 07:36:42.006065 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4vnct" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerName="registry-server" containerID="cri-o://c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3" gracePeriod=2 Mar 07 07:36:42 crc kubenswrapper[4941]: I0307 07:36:42.953646 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.019084 4941 generic.go:334] "Generic (PLEG): container finished" podID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerID="c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3" exitCode=0 Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.019768 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vnct" event={"ID":"bafc086d-3932-4f4e-95e1-02fbef009d6a","Type":"ContainerDied","Data":"c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3"} Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.019888 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vnct" event={"ID":"bafc086d-3932-4f4e-95e1-02fbef009d6a","Type":"ContainerDied","Data":"2018965fffb55832739d6497e53d560d06a3cb72e0bf241a40a13e13dcd05040"} Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.019918 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4vnct" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.019919 4941 scope.go:117] "RemoveContainer" containerID="c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.044804 4941 scope.go:117] "RemoveContainer" containerID="73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.072260 4941 scope.go:117] "RemoveContainer" containerID="478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.090109 4941 scope.go:117] "RemoveContainer" containerID="c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3" Mar 07 07:36:43 crc kubenswrapper[4941]: E0307 07:36:43.090788 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3\": container with ID starting with c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3 not found: ID does not exist" containerID="c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.090885 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3"} err="failed to get container status \"c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3\": rpc error: code = NotFound desc = could not find container \"c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3\": container with ID starting with c83781746eb0b8ac659a83427161702bb94a671d9e5ff5d675ff12cf609bbbf3 not found: ID does not exist" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.090987 4941 scope.go:117] "RemoveContainer" 
containerID="73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97" Mar 07 07:36:43 crc kubenswrapper[4941]: E0307 07:36:43.091437 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97\": container with ID starting with 73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97 not found: ID does not exist" containerID="73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.091508 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97"} err="failed to get container status \"73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97\": rpc error: code = NotFound desc = could not find container \"73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97\": container with ID starting with 73b0e7253e989b0cf87b692fc55f4c84bea42e88249489fd45543a9812cffa97 not found: ID does not exist" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.091544 4941 scope.go:117] "RemoveContainer" containerID="478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c" Mar 07 07:36:43 crc kubenswrapper[4941]: E0307 07:36:43.091892 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c\": container with ID starting with 478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c not found: ID does not exist" containerID="478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.091994 4941 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c"} err="failed to get container status \"478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c\": rpc error: code = NotFound desc = could not find container \"478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c\": container with ID starting with 478e029b11249e2cc881453a5e3be3cd6550ea03823f3e617d306b7d6dab795c not found: ID does not exist" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.121907 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-utilities\") pod \"bafc086d-3932-4f4e-95e1-02fbef009d6a\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.122303 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xpk7\" (UniqueName: \"kubernetes.io/projected/bafc086d-3932-4f4e-95e1-02fbef009d6a-kube-api-access-4xpk7\") pod \"bafc086d-3932-4f4e-95e1-02fbef009d6a\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.122468 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-catalog-content\") pod \"bafc086d-3932-4f4e-95e1-02fbef009d6a\" (UID: \"bafc086d-3932-4f4e-95e1-02fbef009d6a\") " Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.123817 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-utilities" (OuterVolumeSpecName: "utilities") pod "bafc086d-3932-4f4e-95e1-02fbef009d6a" (UID: "bafc086d-3932-4f4e-95e1-02fbef009d6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.130685 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafc086d-3932-4f4e-95e1-02fbef009d6a-kube-api-access-4xpk7" (OuterVolumeSpecName: "kube-api-access-4xpk7") pod "bafc086d-3932-4f4e-95e1-02fbef009d6a" (UID: "bafc086d-3932-4f4e-95e1-02fbef009d6a"). InnerVolumeSpecName "kube-api-access-4xpk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.176539 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bafc086d-3932-4f4e-95e1-02fbef009d6a" (UID: "bafc086d-3932-4f4e-95e1-02fbef009d6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.224223 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.224270 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xpk7\" (UniqueName: \"kubernetes.io/projected/bafc086d-3932-4f4e-95e1-02fbef009d6a-kube-api-access-4xpk7\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.224280 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafc086d-3932-4f4e-95e1-02fbef009d6a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.369334 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4vnct"] Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 
07:36:43.374519 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4vnct"] Mar 07 07:36:43 crc kubenswrapper[4941]: I0307 07:36:43.970197 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" path="/var/lib/kubelet/pods/bafc086d-3932-4f4e-95e1-02fbef009d6a/volumes" Mar 07 07:37:10 crc kubenswrapper[4941]: I0307 07:37:10.314288 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:37:10 crc kubenswrapper[4941]: I0307 07:37:10.314945 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:37:40 crc kubenswrapper[4941]: I0307 07:37:40.314045 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:37:40 crc kubenswrapper[4941]: I0307 07:37:40.314891 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.152955 4941 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29547818-hpvpq"] Mar 07 07:38:00 crc kubenswrapper[4941]: E0307 07:38:00.153820 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerName="registry-server" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.153835 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerName="registry-server" Mar 07 07:38:00 crc kubenswrapper[4941]: E0307 07:38:00.153852 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerName="extract-content" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.153858 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerName="extract-content" Mar 07 07:38:00 crc kubenswrapper[4941]: E0307 07:38:00.153870 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerName="extract-utilities" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.153878 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerName="extract-utilities" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.154032 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafc086d-3932-4f4e-95e1-02fbef009d6a" containerName="registry-server" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.154622 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547818-hpvpq" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.162723 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547818-hpvpq"] Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.163632 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.163970 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.164127 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.291024 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqwfg\" (UniqueName: \"kubernetes.io/projected/1226f6c7-31f5-4897-b578-cd433a603dc0-kube-api-access-kqwfg\") pod \"auto-csr-approver-29547818-hpvpq\" (UID: \"1226f6c7-31f5-4897-b578-cd433a603dc0\") " pod="openshift-infra/auto-csr-approver-29547818-hpvpq" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.392327 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqwfg\" (UniqueName: \"kubernetes.io/projected/1226f6c7-31f5-4897-b578-cd433a603dc0-kube-api-access-kqwfg\") pod \"auto-csr-approver-29547818-hpvpq\" (UID: \"1226f6c7-31f5-4897-b578-cd433a603dc0\") " pod="openshift-infra/auto-csr-approver-29547818-hpvpq" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.411394 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqwfg\" (UniqueName: \"kubernetes.io/projected/1226f6c7-31f5-4897-b578-cd433a603dc0-kube-api-access-kqwfg\") pod \"auto-csr-approver-29547818-hpvpq\" (UID: \"1226f6c7-31f5-4897-b578-cd433a603dc0\") " 
pod="openshift-infra/auto-csr-approver-29547818-hpvpq" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.495868 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547818-hpvpq" Mar 07 07:38:00 crc kubenswrapper[4941]: I0307 07:38:00.913204 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547818-hpvpq"] Mar 07 07:38:01 crc kubenswrapper[4941]: I0307 07:38:01.924048 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547818-hpvpq" event={"ID":"1226f6c7-31f5-4897-b578-cd433a603dc0","Type":"ContainerStarted","Data":"cc9349a8a83f5a681b35900f9cbed24e6fc38d399797c1320a28d386331e6fa9"} Mar 07 07:38:02 crc kubenswrapper[4941]: I0307 07:38:02.935477 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547818-hpvpq" event={"ID":"1226f6c7-31f5-4897-b578-cd433a603dc0","Type":"ContainerStarted","Data":"08a0a52c6e27a3e21d4515d4c94a5193d3627154fb6406255e04023201b7b8c7"} Mar 07 07:38:03 crc kubenswrapper[4941]: I0307 07:38:03.945544 4941 generic.go:334] "Generic (PLEG): container finished" podID="1226f6c7-31f5-4897-b578-cd433a603dc0" containerID="08a0a52c6e27a3e21d4515d4c94a5193d3627154fb6406255e04023201b7b8c7" exitCode=0 Mar 07 07:38:03 crc kubenswrapper[4941]: I0307 07:38:03.945607 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547818-hpvpq" event={"ID":"1226f6c7-31f5-4897-b578-cd433a603dc0","Type":"ContainerDied","Data":"08a0a52c6e27a3e21d4515d4c94a5193d3627154fb6406255e04023201b7b8c7"} Mar 07 07:38:05 crc kubenswrapper[4941]: I0307 07:38:05.256365 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547818-hpvpq" Mar 07 07:38:05 crc kubenswrapper[4941]: I0307 07:38:05.305793 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqwfg\" (UniqueName: \"kubernetes.io/projected/1226f6c7-31f5-4897-b578-cd433a603dc0-kube-api-access-kqwfg\") pod \"1226f6c7-31f5-4897-b578-cd433a603dc0\" (UID: \"1226f6c7-31f5-4897-b578-cd433a603dc0\") " Mar 07 07:38:05 crc kubenswrapper[4941]: I0307 07:38:05.312904 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1226f6c7-31f5-4897-b578-cd433a603dc0-kube-api-access-kqwfg" (OuterVolumeSpecName: "kube-api-access-kqwfg") pod "1226f6c7-31f5-4897-b578-cd433a603dc0" (UID: "1226f6c7-31f5-4897-b578-cd433a603dc0"). InnerVolumeSpecName "kube-api-access-kqwfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:05 crc kubenswrapper[4941]: I0307 07:38:05.408372 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqwfg\" (UniqueName: \"kubernetes.io/projected/1226f6c7-31f5-4897-b578-cd433a603dc0-kube-api-access-kqwfg\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:05 crc kubenswrapper[4941]: I0307 07:38:05.963355 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547818-hpvpq" Mar 07 07:38:05 crc kubenswrapper[4941]: I0307 07:38:05.968504 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547818-hpvpq" event={"ID":"1226f6c7-31f5-4897-b578-cd433a603dc0","Type":"ContainerDied","Data":"cc9349a8a83f5a681b35900f9cbed24e6fc38d399797c1320a28d386331e6fa9"} Mar 07 07:38:05 crc kubenswrapper[4941]: I0307 07:38:05.968537 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc9349a8a83f5a681b35900f9cbed24e6fc38d399797c1320a28d386331e6fa9" Mar 07 07:38:06 crc kubenswrapper[4941]: I0307 07:38:06.388798 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547812-qrf7b"] Mar 07 07:38:06 crc kubenswrapper[4941]: I0307 07:38:06.400578 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547812-qrf7b"] Mar 07 07:38:07 crc kubenswrapper[4941]: I0307 07:38:07.968911 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d39bd1fe-6d59-4d98-96a6-8e80db39b490" path="/var/lib/kubelet/pods/d39bd1fe-6d59-4d98-96a6-8e80db39b490/volumes" Mar 07 07:38:08 crc kubenswrapper[4941]: I0307 07:38:08.740770 4941 scope.go:117] "RemoveContainer" containerID="54c4abff890da20264731abb5419cee796457fe5256455e4855bb09e0728050c" Mar 07 07:38:10 crc kubenswrapper[4941]: I0307 07:38:10.313795 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:38:10 crc kubenswrapper[4941]: I0307 07:38:10.313864 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:38:10 crc kubenswrapper[4941]: I0307 07:38:10.313916 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:38:10 crc kubenswrapper[4941]: I0307 07:38:10.314686 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88ca66b5a032f60fea483bfc96920dae763c35b858a0aa50b1a11df5e9201ea8"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:38:10 crc kubenswrapper[4941]: I0307 07:38:10.314777 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://88ca66b5a032f60fea483bfc96920dae763c35b858a0aa50b1a11df5e9201ea8" gracePeriod=600 Mar 07 07:38:11 crc kubenswrapper[4941]: I0307 07:38:11.015347 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="88ca66b5a032f60fea483bfc96920dae763c35b858a0aa50b1a11df5e9201ea8" exitCode=0 Mar 07 07:38:11 crc kubenswrapper[4941]: I0307 07:38:11.016046 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"88ca66b5a032f60fea483bfc96920dae763c35b858a0aa50b1a11df5e9201ea8"} Mar 07 07:38:11 crc kubenswrapper[4941]: I0307 07:38:11.016094 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" 
event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070"} Mar 07 07:38:11 crc kubenswrapper[4941]: I0307 07:38:11.016141 4941 scope.go:117] "RemoveContainer" containerID="98e8d92be6034b796af7f94f2efd8b8a55f9b522d2c1559f43a8c26502754e85" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.760152 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8sk94"] Mar 07 07:38:39 crc kubenswrapper[4941]: E0307 07:38:39.761429 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1226f6c7-31f5-4897-b578-cd433a603dc0" containerName="oc" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.761455 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1226f6c7-31f5-4897-b578-cd433a603dc0" containerName="oc" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.761744 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1226f6c7-31f5-4897-b578-cd433a603dc0" containerName="oc" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.763555 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.803388 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8sk94"] Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.823952 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ngfq\" (UniqueName: \"kubernetes.io/projected/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-kube-api-access-7ngfq\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.824103 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-catalog-content\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.824140 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-utilities\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.925115 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-catalog-content\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.925160 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-utilities\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.925234 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ngfq\" (UniqueName: \"kubernetes.io/projected/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-kube-api-access-7ngfq\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.926118 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-catalog-content\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.926225 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-utilities\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:39 crc kubenswrapper[4941]: I0307 07:38:39.979158 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ngfq\" (UniqueName: \"kubernetes.io/projected/f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59-kube-api-access-7ngfq\") pod \"community-operators-8sk94\" (UID: \"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59\") " pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:40 crc kubenswrapper[4941]: I0307 07:38:40.100522 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:40 crc kubenswrapper[4941]: I0307 07:38:40.682456 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8sk94"] Mar 07 07:38:41 crc kubenswrapper[4941]: I0307 07:38:41.553990 4941 generic.go:334] "Generic (PLEG): container finished" podID="f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59" containerID="98e7b24d32d38184ffba923331acf70a81e52d16db7f89ba41658f42d697da3b" exitCode=0 Mar 07 07:38:41 crc kubenswrapper[4941]: I0307 07:38:41.554119 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sk94" event={"ID":"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59","Type":"ContainerDied","Data":"98e7b24d32d38184ffba923331acf70a81e52d16db7f89ba41658f42d697da3b"} Mar 07 07:38:41 crc kubenswrapper[4941]: I0307 07:38:41.554596 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sk94" event={"ID":"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59","Type":"ContainerStarted","Data":"82172b464305b59781bd32725d1222b82042730cc5c6d381f41da5df8a4950f7"} Mar 07 07:38:46 crc kubenswrapper[4941]: I0307 07:38:46.612176 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sk94" event={"ID":"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59","Type":"ContainerStarted","Data":"7141293ae4affda9422d42df23511d2dff013ef3a99c09c4eed4c1c02f06b8d5"} Mar 07 07:38:47 crc kubenswrapper[4941]: I0307 07:38:47.620592 4941 generic.go:334] "Generic (PLEG): container finished" podID="f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59" containerID="7141293ae4affda9422d42df23511d2dff013ef3a99c09c4eed4c1c02f06b8d5" exitCode=0 Mar 07 07:38:47 crc kubenswrapper[4941]: I0307 07:38:47.620638 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sk94" 
event={"ID":"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59","Type":"ContainerDied","Data":"7141293ae4affda9422d42df23511d2dff013ef3a99c09c4eed4c1c02f06b8d5"} Mar 07 07:38:49 crc kubenswrapper[4941]: I0307 07:38:49.639124 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8sk94" event={"ID":"f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59","Type":"ContainerStarted","Data":"99478ee4c871a6576e1df8a54086b6d204942c48d9456800e6d3dd6d3c40859d"} Mar 07 07:38:50 crc kubenswrapper[4941]: I0307 07:38:50.102097 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:50 crc kubenswrapper[4941]: I0307 07:38:50.102597 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:38:51 crc kubenswrapper[4941]: I0307 07:38:51.145567 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8sk94" podUID="f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59" containerName="registry-server" probeResult="failure" output=< Mar 07 07:38:51 crc kubenswrapper[4941]: timeout: failed to connect service ":50051" within 1s Mar 07 07:38:51 crc kubenswrapper[4941]: > Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.180509 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8sk94" podStartSLOduration=10.142912746 podStartE2EDuration="17.180484859s" podCreationTimestamp="2026-03-07 07:38:39 +0000 UTC" firstStartedPulling="2026-03-07 07:38:41.558318228 +0000 UTC m=+2818.510683733" lastFinishedPulling="2026-03-07 07:38:48.595890371 +0000 UTC m=+2825.548255846" observedRunningTime="2026-03-07 07:38:49.664196634 +0000 UTC m=+2826.616562109" watchObservedRunningTime="2026-03-07 07:38:56.180484859 +0000 UTC m=+2833.132850334" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.183184 4941 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xrhtl"] Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.185909 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.195017 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrhtl"] Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.261561 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd9gx\" (UniqueName: \"kubernetes.io/projected/3d20d3ea-6e74-43a5-8767-6801b219a8f2-kube-api-access-sd9gx\") pod \"redhat-operators-xrhtl\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.261693 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-catalog-content\") pod \"redhat-operators-xrhtl\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.261746 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-utilities\") pod \"redhat-operators-xrhtl\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.362989 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd9gx\" (UniqueName: \"kubernetes.io/projected/3d20d3ea-6e74-43a5-8767-6801b219a8f2-kube-api-access-sd9gx\") pod \"redhat-operators-xrhtl\" (UID: 
\"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.363082 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-catalog-content\") pod \"redhat-operators-xrhtl\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.363136 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-utilities\") pod \"redhat-operators-xrhtl\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.363600 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-utilities\") pod \"redhat-operators-xrhtl\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.363691 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-catalog-content\") pod \"redhat-operators-xrhtl\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.392729 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd9gx\" (UniqueName: \"kubernetes.io/projected/3d20d3ea-6e74-43a5-8767-6801b219a8f2-kube-api-access-sd9gx\") pod \"redhat-operators-xrhtl\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " 
pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.510119 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:38:56 crc kubenswrapper[4941]: I0307 07:38:56.948713 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrhtl"] Mar 07 07:38:56 crc kubenswrapper[4941]: W0307 07:38:56.950542 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d20d3ea_6e74_43a5_8767_6801b219a8f2.slice/crio-e4b5f37977eb2df8591ff60e7fa395bbb54ae9cac26fb0cd8a633b91ce10ad81 WatchSource:0}: Error finding container e4b5f37977eb2df8591ff60e7fa395bbb54ae9cac26fb0cd8a633b91ce10ad81: Status 404 returned error can't find the container with id e4b5f37977eb2df8591ff60e7fa395bbb54ae9cac26fb0cd8a633b91ce10ad81 Mar 07 07:38:57 crc kubenswrapper[4941]: I0307 07:38:57.708125 4941 generic.go:334] "Generic (PLEG): container finished" podID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerID="912b5be2be550cd5840ab4ca70ea8c9a178ffb10e6bc51463e9abcebe978ba98" exitCode=0 Mar 07 07:38:57 crc kubenswrapper[4941]: I0307 07:38:57.708193 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhtl" event={"ID":"3d20d3ea-6e74-43a5-8767-6801b219a8f2","Type":"ContainerDied","Data":"912b5be2be550cd5840ab4ca70ea8c9a178ffb10e6bc51463e9abcebe978ba98"} Mar 07 07:38:57 crc kubenswrapper[4941]: I0307 07:38:57.708222 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhtl" event={"ID":"3d20d3ea-6e74-43a5-8767-6801b219a8f2","Type":"ContainerStarted","Data":"e4b5f37977eb2df8591ff60e7fa395bbb54ae9cac26fb0cd8a633b91ce10ad81"} Mar 07 07:38:58 crc kubenswrapper[4941]: I0307 07:38:58.718394 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xrhtl" event={"ID":"3d20d3ea-6e74-43a5-8767-6801b219a8f2","Type":"ContainerStarted","Data":"455fffedfc39caec66c1655a0d8c0fa6d3374ec7e1a914f7f5f22c38d940632e"} Mar 07 07:38:59 crc kubenswrapper[4941]: I0307 07:38:59.727333 4941 generic.go:334] "Generic (PLEG): container finished" podID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerID="455fffedfc39caec66c1655a0d8c0fa6d3374ec7e1a914f7f5f22c38d940632e" exitCode=0 Mar 07 07:38:59 crc kubenswrapper[4941]: I0307 07:38:59.727553 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhtl" event={"ID":"3d20d3ea-6e74-43a5-8767-6801b219a8f2","Type":"ContainerDied","Data":"455fffedfc39caec66c1655a0d8c0fa6d3374ec7e1a914f7f5f22c38d940632e"} Mar 07 07:39:00 crc kubenswrapper[4941]: I0307 07:39:00.154187 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:39:00 crc kubenswrapper[4941]: I0307 07:39:00.198888 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8sk94" Mar 07 07:39:00 crc kubenswrapper[4941]: I0307 07:39:00.739226 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhtl" event={"ID":"3d20d3ea-6e74-43a5-8767-6801b219a8f2","Type":"ContainerStarted","Data":"d88649ff5ed9cac280c75fe975e8eb3efc35c562d8234c73d49b0200a4d1497c"} Mar 07 07:39:00 crc kubenswrapper[4941]: I0307 07:39:00.776519 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xrhtl" podStartSLOduration=2.36437638 podStartE2EDuration="4.776490256s" podCreationTimestamp="2026-03-07 07:38:56 +0000 UTC" firstStartedPulling="2026-03-07 07:38:57.709814528 +0000 UTC m=+2834.662179993" lastFinishedPulling="2026-03-07 07:39:00.121928394 +0000 UTC m=+2837.074293869" observedRunningTime="2026-03-07 07:39:00.76428845 
+0000 UTC m=+2837.716653955" watchObservedRunningTime="2026-03-07 07:39:00.776490256 +0000 UTC m=+2837.728855761" Mar 07 07:39:02 crc kubenswrapper[4941]: I0307 07:39:02.234465 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8sk94"] Mar 07 07:39:02 crc kubenswrapper[4941]: I0307 07:39:02.578574 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nst75"] Mar 07 07:39:02 crc kubenswrapper[4941]: I0307 07:39:02.578871 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nst75" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerName="registry-server" containerID="cri-o://fc3a9c4579c2b26e078c9019735b718c51eb7637ba1f6a93ffc045b5ebcbd4ac" gracePeriod=2 Mar 07 07:39:02 crc kubenswrapper[4941]: I0307 07:39:02.752548 4941 generic.go:334] "Generic (PLEG): container finished" podID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerID="fc3a9c4579c2b26e078c9019735b718c51eb7637ba1f6a93ffc045b5ebcbd4ac" exitCode=0 Mar 07 07:39:02 crc kubenswrapper[4941]: I0307 07:39:02.752591 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nst75" event={"ID":"1f666a7e-4fe4-44a4-8faa-25436dfd3753","Type":"ContainerDied","Data":"fc3a9c4579c2b26e078c9019735b718c51eb7637ba1f6a93ffc045b5ebcbd4ac"} Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.505370 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nst75" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.558555 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-catalog-content\") pod \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.558705 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbxv4\" (UniqueName: \"kubernetes.io/projected/1f666a7e-4fe4-44a4-8faa-25436dfd3753-kube-api-access-gbxv4\") pod \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.558768 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-utilities\") pod \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\" (UID: \"1f666a7e-4fe4-44a4-8faa-25436dfd3753\") " Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.559489 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-utilities" (OuterVolumeSpecName: "utilities") pod "1f666a7e-4fe4-44a4-8faa-25436dfd3753" (UID: "1f666a7e-4fe4-44a4-8faa-25436dfd3753"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.564778 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f666a7e-4fe4-44a4-8faa-25436dfd3753-kube-api-access-gbxv4" (OuterVolumeSpecName: "kube-api-access-gbxv4") pod "1f666a7e-4fe4-44a4-8faa-25436dfd3753" (UID: "1f666a7e-4fe4-44a4-8faa-25436dfd3753"). InnerVolumeSpecName "kube-api-access-gbxv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.613293 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f666a7e-4fe4-44a4-8faa-25436dfd3753" (UID: "1f666a7e-4fe4-44a4-8faa-25436dfd3753"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.659951 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbxv4\" (UniqueName: \"kubernetes.io/projected/1f666a7e-4fe4-44a4-8faa-25436dfd3753-kube-api-access-gbxv4\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.659988 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.660000 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f666a7e-4fe4-44a4-8faa-25436dfd3753-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.761038 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nst75" event={"ID":"1f666a7e-4fe4-44a4-8faa-25436dfd3753","Type":"ContainerDied","Data":"d8e8b972218ba1f5285fc751bb147ca789f22db3493987877f934ac910fcb33e"} Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.761092 4941 scope.go:117] "RemoveContainer" containerID="fc3a9c4579c2b26e078c9019735b718c51eb7637ba1f6a93ffc045b5ebcbd4ac" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.761226 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nst75" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.788625 4941 scope.go:117] "RemoveContainer" containerID="cfbe5382a505fa5b203a29534fbca11abf2f5bcfeae42f0efb4f0625da36ca7d" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.788767 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nst75"] Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.792580 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nst75"] Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.816576 4941 scope.go:117] "RemoveContainer" containerID="4124e3f5a055901b92a94ec2098c0f08d14635bfc57d16a43caae8e608189c7b" Mar 07 07:39:03 crc kubenswrapper[4941]: I0307 07:39:03.963848 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" path="/var/lib/kubelet/pods/1f666a7e-4fe4-44a4-8faa-25436dfd3753/volumes" Mar 07 07:39:06 crc kubenswrapper[4941]: I0307 07:39:06.510715 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:39:06 crc kubenswrapper[4941]: I0307 07:39:06.510978 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:39:07 crc kubenswrapper[4941]: I0307 07:39:07.558908 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrhtl" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="registry-server" probeResult="failure" output=< Mar 07 07:39:07 crc kubenswrapper[4941]: timeout: failed to connect service ":50051" within 1s Mar 07 07:39:07 crc kubenswrapper[4941]: > Mar 07 07:39:16 crc kubenswrapper[4941]: I0307 07:39:16.578478 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:39:16 crc kubenswrapper[4941]: I0307 07:39:16.660017 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:39:16 crc kubenswrapper[4941]: I0307 07:39:16.825046 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrhtl"] Mar 07 07:39:17 crc kubenswrapper[4941]: I0307 07:39:17.858661 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xrhtl" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="registry-server" containerID="cri-o://d88649ff5ed9cac280c75fe975e8eb3efc35c562d8234c73d49b0200a4d1497c" gracePeriod=2 Mar 07 07:39:18 crc kubenswrapper[4941]: I0307 07:39:18.867807 4941 generic.go:334] "Generic (PLEG): container finished" podID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerID="d88649ff5ed9cac280c75fe975e8eb3efc35c562d8234c73d49b0200a4d1497c" exitCode=0 Mar 07 07:39:18 crc kubenswrapper[4941]: I0307 07:39:18.867868 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhtl" event={"ID":"3d20d3ea-6e74-43a5-8767-6801b219a8f2","Type":"ContainerDied","Data":"d88649ff5ed9cac280c75fe975e8eb3efc35c562d8234c73d49b0200a4d1497c"} Mar 07 07:39:18 crc kubenswrapper[4941]: I0307 07:39:18.986779 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.071229 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd9gx\" (UniqueName: \"kubernetes.io/projected/3d20d3ea-6e74-43a5-8767-6801b219a8f2-kube-api-access-sd9gx\") pod \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.071303 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-utilities\") pod \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.071362 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-catalog-content\") pod \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\" (UID: \"3d20d3ea-6e74-43a5-8767-6801b219a8f2\") " Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.073055 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-utilities" (OuterVolumeSpecName: "utilities") pod "3d20d3ea-6e74-43a5-8767-6801b219a8f2" (UID: "3d20d3ea-6e74-43a5-8767-6801b219a8f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.078554 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d20d3ea-6e74-43a5-8767-6801b219a8f2-kube-api-access-sd9gx" (OuterVolumeSpecName: "kube-api-access-sd9gx") pod "3d20d3ea-6e74-43a5-8767-6801b219a8f2" (UID: "3d20d3ea-6e74-43a5-8767-6801b219a8f2"). InnerVolumeSpecName "kube-api-access-sd9gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.173664 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd9gx\" (UniqueName: \"kubernetes.io/projected/3d20d3ea-6e74-43a5-8767-6801b219a8f2-kube-api-access-sd9gx\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.173704 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.224275 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d20d3ea-6e74-43a5-8767-6801b219a8f2" (UID: "3d20d3ea-6e74-43a5-8767-6801b219a8f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.275141 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d20d3ea-6e74-43a5-8767-6801b219a8f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.877726 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhtl" event={"ID":"3d20d3ea-6e74-43a5-8767-6801b219a8f2","Type":"ContainerDied","Data":"e4b5f37977eb2df8591ff60e7fa395bbb54ae9cac26fb0cd8a633b91ce10ad81"} Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.877776 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhtl" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.877784 4941 scope.go:117] "RemoveContainer" containerID="d88649ff5ed9cac280c75fe975e8eb3efc35c562d8234c73d49b0200a4d1497c" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.920683 4941 scope.go:117] "RemoveContainer" containerID="455fffedfc39caec66c1655a0d8c0fa6d3374ec7e1a914f7f5f22c38d940632e" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.929323 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrhtl"] Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.937399 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xrhtl"] Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.939620 4941 scope.go:117] "RemoveContainer" containerID="912b5be2be550cd5840ab4ca70ea8c9a178ffb10e6bc51463e9abcebe978ba98" Mar 07 07:39:19 crc kubenswrapper[4941]: I0307 07:39:19.968369 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" path="/var/lib/kubelet/pods/3d20d3ea-6e74-43a5-8767-6801b219a8f2/volumes" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.147909 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547820-7d9rb"] Mar 07 07:40:00 crc kubenswrapper[4941]: E0307 07:40:00.149089 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="extract-utilities" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.149112 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="extract-utilities" Mar 07 07:40:00 crc kubenswrapper[4941]: E0307 07:40:00.149142 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerName="extract-content" Mar 07 07:40:00 crc 
kubenswrapper[4941]: I0307 07:40:00.149156 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerName="extract-content" Mar 07 07:40:00 crc kubenswrapper[4941]: E0307 07:40:00.149178 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerName="registry-server" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.149192 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerName="registry-server" Mar 07 07:40:00 crc kubenswrapper[4941]: E0307 07:40:00.149211 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="registry-server" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.149224 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="registry-server" Mar 07 07:40:00 crc kubenswrapper[4941]: E0307 07:40:00.149246 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="extract-content" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.149259 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="extract-content" Mar 07 07:40:00 crc kubenswrapper[4941]: E0307 07:40:00.149289 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerName="extract-utilities" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.149303 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerName="extract-utilities" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.149587 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d20d3ea-6e74-43a5-8767-6801b219a8f2" containerName="registry-server" Mar 07 07:40:00 crc 
kubenswrapper[4941]: I0307 07:40:00.149628 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f666a7e-4fe4-44a4-8faa-25436dfd3753" containerName="registry-server" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.150361 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547820-7d9rb" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.153849 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.154160 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.160766 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.161652 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547820-7d9rb"] Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.287768 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swpmt\" (UniqueName: \"kubernetes.io/projected/54d37680-772d-48a1-b229-4e91565a28ad-kube-api-access-swpmt\") pod \"auto-csr-approver-29547820-7d9rb\" (UID: \"54d37680-772d-48a1-b229-4e91565a28ad\") " pod="openshift-infra/auto-csr-approver-29547820-7d9rb" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.389935 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swpmt\" (UniqueName: \"kubernetes.io/projected/54d37680-772d-48a1-b229-4e91565a28ad-kube-api-access-swpmt\") pod \"auto-csr-approver-29547820-7d9rb\" (UID: \"54d37680-772d-48a1-b229-4e91565a28ad\") " pod="openshift-infra/auto-csr-approver-29547820-7d9rb" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.409808 
4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swpmt\" (UniqueName: \"kubernetes.io/projected/54d37680-772d-48a1-b229-4e91565a28ad-kube-api-access-swpmt\") pod \"auto-csr-approver-29547820-7d9rb\" (UID: \"54d37680-772d-48a1-b229-4e91565a28ad\") " pod="openshift-infra/auto-csr-approver-29547820-7d9rb" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.473249 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547820-7d9rb" Mar 07 07:40:00 crc kubenswrapper[4941]: I0307 07:40:00.894177 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547820-7d9rb"] Mar 07 07:40:01 crc kubenswrapper[4941]: I0307 07:40:01.429019 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547820-7d9rb" event={"ID":"54d37680-772d-48a1-b229-4e91565a28ad","Type":"ContainerStarted","Data":"e2c976b2b7c1967163b25d4a390e686f40371c54fe35def4e64ca3ec9ecd0301"} Mar 07 07:40:03 crc kubenswrapper[4941]: I0307 07:40:03.445888 4941 generic.go:334] "Generic (PLEG): container finished" podID="54d37680-772d-48a1-b229-4e91565a28ad" containerID="4ece2f8ba8ad11a2f62165c6184fdc93539e9d30b454fcccbcb58bde8698c1ee" exitCode=0 Mar 07 07:40:03 crc kubenswrapper[4941]: I0307 07:40:03.445970 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547820-7d9rb" event={"ID":"54d37680-772d-48a1-b229-4e91565a28ad","Type":"ContainerDied","Data":"4ece2f8ba8ad11a2f62165c6184fdc93539e9d30b454fcccbcb58bde8698c1ee"} Mar 07 07:40:04 crc kubenswrapper[4941]: I0307 07:40:04.782780 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547820-7d9rb" Mar 07 07:40:04 crc kubenswrapper[4941]: I0307 07:40:04.858749 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swpmt\" (UniqueName: \"kubernetes.io/projected/54d37680-772d-48a1-b229-4e91565a28ad-kube-api-access-swpmt\") pod \"54d37680-772d-48a1-b229-4e91565a28ad\" (UID: \"54d37680-772d-48a1-b229-4e91565a28ad\") " Mar 07 07:40:04 crc kubenswrapper[4941]: I0307 07:40:04.865852 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d37680-772d-48a1-b229-4e91565a28ad-kube-api-access-swpmt" (OuterVolumeSpecName: "kube-api-access-swpmt") pod "54d37680-772d-48a1-b229-4e91565a28ad" (UID: "54d37680-772d-48a1-b229-4e91565a28ad"). InnerVolumeSpecName "kube-api-access-swpmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:04 crc kubenswrapper[4941]: I0307 07:40:04.960696 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swpmt\" (UniqueName: \"kubernetes.io/projected/54d37680-772d-48a1-b229-4e91565a28ad-kube-api-access-swpmt\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:05 crc kubenswrapper[4941]: I0307 07:40:05.476501 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547820-7d9rb" event={"ID":"54d37680-772d-48a1-b229-4e91565a28ad","Type":"ContainerDied","Data":"e2c976b2b7c1967163b25d4a390e686f40371c54fe35def4e64ca3ec9ecd0301"} Mar 07 07:40:05 crc kubenswrapper[4941]: I0307 07:40:05.476551 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547820-7d9rb" Mar 07 07:40:05 crc kubenswrapper[4941]: I0307 07:40:05.476555 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2c976b2b7c1967163b25d4a390e686f40371c54fe35def4e64ca3ec9ecd0301" Mar 07 07:40:05 crc kubenswrapper[4941]: I0307 07:40:05.856759 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547814-22gtq"] Mar 07 07:40:05 crc kubenswrapper[4941]: I0307 07:40:05.867079 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547814-22gtq"] Mar 07 07:40:05 crc kubenswrapper[4941]: I0307 07:40:05.964270 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541b3f6c-3eac-496e-a9cc-09e1834da93d" path="/var/lib/kubelet/pods/541b3f6c-3eac-496e-a9cc-09e1834da93d/volumes" Mar 07 07:40:08 crc kubenswrapper[4941]: I0307 07:40:08.829102 4941 scope.go:117] "RemoveContainer" containerID="cfaa09cf2964886758e522d378a75f62cf371319b31ba721cfe6960223f0ed3a" Mar 07 07:40:10 crc kubenswrapper[4941]: I0307 07:40:10.313799 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:40:10 crc kubenswrapper[4941]: I0307 07:40:10.314192 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:40:40 crc kubenswrapper[4941]: I0307 07:40:40.313523 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:40:40 crc kubenswrapper[4941]: I0307 07:40:40.314048 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:41:10 crc kubenswrapper[4941]: I0307 07:41:10.314292 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:41:10 crc kubenswrapper[4941]: I0307 07:41:10.315061 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:41:10 crc kubenswrapper[4941]: I0307 07:41:10.315128 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 07:41:10 crc kubenswrapper[4941]: I0307 07:41:10.315988 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:41:10 crc 
kubenswrapper[4941]: I0307 07:41:10.316084 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" gracePeriod=600 Mar 07 07:41:10 crc kubenswrapper[4941]: E0307 07:41:10.454922 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:41:11 crc kubenswrapper[4941]: I0307 07:41:11.042487 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" exitCode=0 Mar 07 07:41:11 crc kubenswrapper[4941]: I0307 07:41:11.042558 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070"} Mar 07 07:41:11 crc kubenswrapper[4941]: I0307 07:41:11.042611 4941 scope.go:117] "RemoveContainer" containerID="88ca66b5a032f60fea483bfc96920dae763c35b858a0aa50b1a11df5e9201ea8" Mar 07 07:41:11 crc kubenswrapper[4941]: I0307 07:41:11.043378 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:41:11 crc kubenswrapper[4941]: E0307 07:41:11.043765 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:41:24 crc kubenswrapper[4941]: I0307 07:41:24.954180 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:41:24 crc kubenswrapper[4941]: E0307 07:41:24.954967 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:41:39 crc kubenswrapper[4941]: I0307 07:41:39.955326 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:41:39 crc kubenswrapper[4941]: E0307 07:41:39.956417 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:41:50 crc kubenswrapper[4941]: I0307 07:41:50.955031 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:41:50 crc kubenswrapper[4941]: E0307 07:41:50.955931 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.156080 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547822-mvrdm"] Mar 07 07:42:00 crc kubenswrapper[4941]: E0307 07:42:00.156961 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d37680-772d-48a1-b229-4e91565a28ad" containerName="oc" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.156976 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d37680-772d-48a1-b229-4e91565a28ad" containerName="oc" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.157147 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d37680-772d-48a1-b229-4e91565a28ad" containerName="oc" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.157719 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547822-mvrdm" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.160500 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.160984 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.162798 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.165731 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547822-mvrdm"] Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.253007 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svpb\" (UniqueName: \"kubernetes.io/projected/50992226-d92b-41d6-b68a-71a56abb9793-kube-api-access-4svpb\") pod \"auto-csr-approver-29547822-mvrdm\" (UID: \"50992226-d92b-41d6-b68a-71a56abb9793\") " pod="openshift-infra/auto-csr-approver-29547822-mvrdm" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.354962 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4svpb\" (UniqueName: \"kubernetes.io/projected/50992226-d92b-41d6-b68a-71a56abb9793-kube-api-access-4svpb\") pod \"auto-csr-approver-29547822-mvrdm\" (UID: \"50992226-d92b-41d6-b68a-71a56abb9793\") " pod="openshift-infra/auto-csr-approver-29547822-mvrdm" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.378308 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4svpb\" (UniqueName: \"kubernetes.io/projected/50992226-d92b-41d6-b68a-71a56abb9793-kube-api-access-4svpb\") pod \"auto-csr-approver-29547822-mvrdm\" (UID: \"50992226-d92b-41d6-b68a-71a56abb9793\") " 
pod="openshift-infra/auto-csr-approver-29547822-mvrdm" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.478514 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547822-mvrdm" Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.932385 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547822-mvrdm"] Mar 07 07:42:00 crc kubenswrapper[4941]: I0307 07:42:00.937735 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:42:01 crc kubenswrapper[4941]: I0307 07:42:01.477115 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547822-mvrdm" event={"ID":"50992226-d92b-41d6-b68a-71a56abb9793","Type":"ContainerStarted","Data":"6ba77de7e868c4afa4dcdaef8d5409b9585fa73094b9c46cc3370c0b729ee5c6"} Mar 07 07:42:03 crc kubenswrapper[4941]: I0307 07:42:03.496027 4941 generic.go:334] "Generic (PLEG): container finished" podID="50992226-d92b-41d6-b68a-71a56abb9793" containerID="f2fa7da72412c1dc52e3116e58ecb16b11a0ef5baaade900c35732dfcb2dac68" exitCode=0 Mar 07 07:42:03 crc kubenswrapper[4941]: I0307 07:42:03.496200 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547822-mvrdm" event={"ID":"50992226-d92b-41d6-b68a-71a56abb9793","Type":"ContainerDied","Data":"f2fa7da72412c1dc52e3116e58ecb16b11a0ef5baaade900c35732dfcb2dac68"} Mar 07 07:42:04 crc kubenswrapper[4941]: I0307 07:42:04.803587 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547822-mvrdm" Mar 07 07:42:04 crc kubenswrapper[4941]: I0307 07:42:04.922750 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4svpb\" (UniqueName: \"kubernetes.io/projected/50992226-d92b-41d6-b68a-71a56abb9793-kube-api-access-4svpb\") pod \"50992226-d92b-41d6-b68a-71a56abb9793\" (UID: \"50992226-d92b-41d6-b68a-71a56abb9793\") " Mar 07 07:42:04 crc kubenswrapper[4941]: I0307 07:42:04.943181 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50992226-d92b-41d6-b68a-71a56abb9793-kube-api-access-4svpb" (OuterVolumeSpecName: "kube-api-access-4svpb") pod "50992226-d92b-41d6-b68a-71a56abb9793" (UID: "50992226-d92b-41d6-b68a-71a56abb9793"). InnerVolumeSpecName "kube-api-access-4svpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:42:05 crc kubenswrapper[4941]: I0307 07:42:05.024822 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4svpb\" (UniqueName: \"kubernetes.io/projected/50992226-d92b-41d6-b68a-71a56abb9793-kube-api-access-4svpb\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:05 crc kubenswrapper[4941]: I0307 07:42:05.512864 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547822-mvrdm" event={"ID":"50992226-d92b-41d6-b68a-71a56abb9793","Type":"ContainerDied","Data":"6ba77de7e868c4afa4dcdaef8d5409b9585fa73094b9c46cc3370c0b729ee5c6"} Mar 07 07:42:05 crc kubenswrapper[4941]: I0307 07:42:05.512900 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba77de7e868c4afa4dcdaef8d5409b9585fa73094b9c46cc3370c0b729ee5c6" Mar 07 07:42:05 crc kubenswrapper[4941]: I0307 07:42:05.512964 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547822-mvrdm" Mar 07 07:42:05 crc kubenswrapper[4941]: I0307 07:42:05.883604 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547816-wbc9b"] Mar 07 07:42:05 crc kubenswrapper[4941]: I0307 07:42:05.891185 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547816-wbc9b"] Mar 07 07:42:05 crc kubenswrapper[4941]: I0307 07:42:05.955177 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:42:05 crc kubenswrapper[4941]: E0307 07:42:05.955447 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:42:05 crc kubenswrapper[4941]: I0307 07:42:05.962572 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b8a10c-855b-46f4-844e-8e9fbb15ad39" path="/var/lib/kubelet/pods/29b8a10c-855b-46f4-844e-8e9fbb15ad39/volumes" Mar 07 07:42:08 crc kubenswrapper[4941]: I0307 07:42:08.943808 4941 scope.go:117] "RemoveContainer" containerID="16d3a4e7365a953536360e02b7a6f43b5b8ca5e991327c85c29675898aa5340e" Mar 07 07:42:18 crc kubenswrapper[4941]: I0307 07:42:18.955745 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:42:18 crc kubenswrapper[4941]: E0307 07:42:18.956634 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:42:33 crc kubenswrapper[4941]: I0307 07:42:33.961124 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:42:33 crc kubenswrapper[4941]: E0307 07:42:33.961809 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:42:45 crc kubenswrapper[4941]: I0307 07:42:45.954483 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:42:45 crc kubenswrapper[4941]: E0307 07:42:45.955241 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:42:59 crc kubenswrapper[4941]: I0307 07:42:59.955616 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:42:59 crc kubenswrapper[4941]: E0307 07:42:59.956573 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:43:12 crc kubenswrapper[4941]: I0307 07:43:12.954350 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:43:12 crc kubenswrapper[4941]: E0307 07:43:12.955190 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:43:24 crc kubenswrapper[4941]: I0307 07:43:24.954934 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:43:24 crc kubenswrapper[4941]: E0307 07:43:24.955968 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:43:37 crc kubenswrapper[4941]: I0307 07:43:37.955551 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:43:37 crc kubenswrapper[4941]: E0307 07:43:37.956669 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:43:49 crc kubenswrapper[4941]: I0307 07:43:49.954275 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:43:49 crc kubenswrapper[4941]: E0307 07:43:49.955196 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.154123 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547824-fzvrq"] Mar 07 07:44:00 crc kubenswrapper[4941]: E0307 07:44:00.155108 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50992226-d92b-41d6-b68a-71a56abb9793" containerName="oc" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.155125 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="50992226-d92b-41d6-b68a-71a56abb9793" containerName="oc" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.155291 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="50992226-d92b-41d6-b68a-71a56abb9793" containerName="oc" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.155912 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547824-fzvrq" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.157869 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.158186 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.161371 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.162243 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547824-fzvrq"] Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.251942 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7x2\" (UniqueName: \"kubernetes.io/projected/30cb74da-f324-4715-ae31-1b987e8ef17d-kube-api-access-gn7x2\") pod \"auto-csr-approver-29547824-fzvrq\" (UID: \"30cb74da-f324-4715-ae31-1b987e8ef17d\") " pod="openshift-infra/auto-csr-approver-29547824-fzvrq" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.353441 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7x2\" (UniqueName: \"kubernetes.io/projected/30cb74da-f324-4715-ae31-1b987e8ef17d-kube-api-access-gn7x2\") pod \"auto-csr-approver-29547824-fzvrq\" (UID: \"30cb74da-f324-4715-ae31-1b987e8ef17d\") " pod="openshift-infra/auto-csr-approver-29547824-fzvrq" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.371431 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7x2\" (UniqueName: \"kubernetes.io/projected/30cb74da-f324-4715-ae31-1b987e8ef17d-kube-api-access-gn7x2\") pod \"auto-csr-approver-29547824-fzvrq\" (UID: \"30cb74da-f324-4715-ae31-1b987e8ef17d\") " 
pod="openshift-infra/auto-csr-approver-29547824-fzvrq" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.472836 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547824-fzvrq" Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.704445 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547824-fzvrq"] Mar 07 07:44:00 crc kubenswrapper[4941]: I0307 07:44:00.954481 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:44:00 crc kubenswrapper[4941]: E0307 07:44:00.954886 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:44:01 crc kubenswrapper[4941]: I0307 07:44:01.709623 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547824-fzvrq" event={"ID":"30cb74da-f324-4715-ae31-1b987e8ef17d","Type":"ContainerStarted","Data":"960fb1228978b7e491872d526ee10b89e0ec84e38fce32d7598d5cd5907e83c1"} Mar 07 07:44:07 crc kubenswrapper[4941]: I0307 07:44:07.754170 4941 generic.go:334] "Generic (PLEG): container finished" podID="30cb74da-f324-4715-ae31-1b987e8ef17d" containerID="2fefe631df98173aa45252d754ec39191d78b7a1e96c5e8ef00de3499975b869" exitCode=0 Mar 07 07:44:07 crc kubenswrapper[4941]: I0307 07:44:07.754243 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547824-fzvrq" event={"ID":"30cb74da-f324-4715-ae31-1b987e8ef17d","Type":"ContainerDied","Data":"2fefe631df98173aa45252d754ec39191d78b7a1e96c5e8ef00de3499975b869"} 
Mar 07 07:44:09 crc kubenswrapper[4941]: I0307 07:44:09.002492 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547824-fzvrq" Mar 07 07:44:09 crc kubenswrapper[4941]: I0307 07:44:09.089949 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn7x2\" (UniqueName: \"kubernetes.io/projected/30cb74da-f324-4715-ae31-1b987e8ef17d-kube-api-access-gn7x2\") pod \"30cb74da-f324-4715-ae31-1b987e8ef17d\" (UID: \"30cb74da-f324-4715-ae31-1b987e8ef17d\") " Mar 07 07:44:09 crc kubenswrapper[4941]: I0307 07:44:09.095453 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30cb74da-f324-4715-ae31-1b987e8ef17d-kube-api-access-gn7x2" (OuterVolumeSpecName: "kube-api-access-gn7x2") pod "30cb74da-f324-4715-ae31-1b987e8ef17d" (UID: "30cb74da-f324-4715-ae31-1b987e8ef17d"). InnerVolumeSpecName "kube-api-access-gn7x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:44:09 crc kubenswrapper[4941]: I0307 07:44:09.191927 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn7x2\" (UniqueName: \"kubernetes.io/projected/30cb74da-f324-4715-ae31-1b987e8ef17d-kube-api-access-gn7x2\") on node \"crc\" DevicePath \"\"" Mar 07 07:44:09 crc kubenswrapper[4941]: I0307 07:44:09.771429 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547824-fzvrq" event={"ID":"30cb74da-f324-4715-ae31-1b987e8ef17d","Type":"ContainerDied","Data":"960fb1228978b7e491872d526ee10b89e0ec84e38fce32d7598d5cd5907e83c1"} Mar 07 07:44:09 crc kubenswrapper[4941]: I0307 07:44:09.771710 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="960fb1228978b7e491872d526ee10b89e0ec84e38fce32d7598d5cd5907e83c1" Mar 07 07:44:09 crc kubenswrapper[4941]: I0307 07:44:09.771587 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547824-fzvrq" Mar 07 07:44:10 crc kubenswrapper[4941]: I0307 07:44:10.067631 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547818-hpvpq"] Mar 07 07:44:10 crc kubenswrapper[4941]: I0307 07:44:10.073374 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547818-hpvpq"] Mar 07 07:44:11 crc kubenswrapper[4941]: I0307 07:44:11.964546 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1226f6c7-31f5-4897-b578-cd433a603dc0" path="/var/lib/kubelet/pods/1226f6c7-31f5-4897-b578-cd433a603dc0/volumes" Mar 07 07:44:14 crc kubenswrapper[4941]: I0307 07:44:14.955177 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:44:14 crc kubenswrapper[4941]: E0307 07:44:14.955761 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:44:25 crc kubenswrapper[4941]: I0307 07:44:25.954330 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:44:25 crc kubenswrapper[4941]: E0307 07:44:25.954985 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" 
podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:44:38 crc kubenswrapper[4941]: I0307 07:44:38.954621 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:44:38 crc kubenswrapper[4941]: E0307 07:44:38.955542 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:44:49 crc kubenswrapper[4941]: I0307 07:44:49.955478 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:44:49 crc kubenswrapper[4941]: E0307 07:44:49.956724 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.171372 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4"] Mar 07 07:45:00 crc kubenswrapper[4941]: E0307 07:45:00.172283 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cb74da-f324-4715-ae31-1b987e8ef17d" containerName="oc" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.172298 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cb74da-f324-4715-ae31-1b987e8ef17d" containerName="oc" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 
07:45:00.172519 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="30cb74da-f324-4715-ae31-1b987e8ef17d" containerName="oc" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.173042 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.176976 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.178166 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.191122 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4"] Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.302483 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqfhg\" (UniqueName: \"kubernetes.io/projected/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-kube-api-access-sqfhg\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.302569 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-secret-volume\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.302736 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-config-volume\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.403841 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-config-volume\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.403956 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqfhg\" (UniqueName: \"kubernetes.io/projected/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-kube-api-access-sqfhg\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.404010 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-secret-volume\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.405752 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-config-volume\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc 
kubenswrapper[4941]: I0307 07:45:00.413133 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-secret-volume\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.431595 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqfhg\" (UniqueName: \"kubernetes.io/projected/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-kube-api-access-sqfhg\") pod \"collect-profiles-29547825-8n7m4\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.501171 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:00 crc kubenswrapper[4941]: I0307 07:45:00.976474 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4"] Mar 07 07:45:01 crc kubenswrapper[4941]: I0307 07:45:01.013365 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" event={"ID":"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb","Type":"ContainerStarted","Data":"534d545377975cf7ca6a21223f9cb6c853e7ff6722d3350f89fa3f82f184cd42"} Mar 07 07:45:02 crc kubenswrapper[4941]: I0307 07:45:02.022447 4941 generic.go:334] "Generic (PLEG): container finished" podID="ebaa4910-35c1-4caf-97ba-99d85ee5ebdb" containerID="ec199197aa9f11455dfba62844e90e5767215ef342d9dab9818afd208ff3d198" exitCode=0 Mar 07 07:45:02 crc kubenswrapper[4941]: I0307 07:45:02.022536 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" event={"ID":"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb","Type":"ContainerDied","Data":"ec199197aa9f11455dfba62844e90e5767215ef342d9dab9818afd208ff3d198"} Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.312497 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.454341 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqfhg\" (UniqueName: \"kubernetes.io/projected/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-kube-api-access-sqfhg\") pod \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.454624 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-config-volume\") pod \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.454674 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-secret-volume\") pod \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\" (UID: \"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb\") " Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.456673 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-config-volume" (OuterVolumeSpecName: "config-volume") pod "ebaa4910-35c1-4caf-97ba-99d85ee5ebdb" (UID: "ebaa4910-35c1-4caf-97ba-99d85ee5ebdb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.462970 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-kube-api-access-sqfhg" (OuterVolumeSpecName: "kube-api-access-sqfhg") pod "ebaa4910-35c1-4caf-97ba-99d85ee5ebdb" (UID: "ebaa4910-35c1-4caf-97ba-99d85ee5ebdb"). InnerVolumeSpecName "kube-api-access-sqfhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.463641 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ebaa4910-35c1-4caf-97ba-99d85ee5ebdb" (UID: "ebaa4910-35c1-4caf-97ba-99d85ee5ebdb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.556846 4941 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.556892 4941 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.556911 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqfhg\" (UniqueName: \"kubernetes.io/projected/ebaa4910-35c1-4caf-97ba-99d85ee5ebdb-kube-api-access-sqfhg\") on node \"crc\" DevicePath \"\"" Mar 07 07:45:03 crc kubenswrapper[4941]: I0307 07:45:03.961037 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:45:03 crc kubenswrapper[4941]: E0307 
07:45:03.961491 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:45:04 crc kubenswrapper[4941]: I0307 07:45:04.039706 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" event={"ID":"ebaa4910-35c1-4caf-97ba-99d85ee5ebdb","Type":"ContainerDied","Data":"534d545377975cf7ca6a21223f9cb6c853e7ff6722d3350f89fa3f82f184cd42"} Mar 07 07:45:04 crc kubenswrapper[4941]: I0307 07:45:04.039750 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-8n7m4" Mar 07 07:45:04 crc kubenswrapper[4941]: I0307 07:45:04.039772 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="534d545377975cf7ca6a21223f9cb6c853e7ff6722d3350f89fa3f82f184cd42" Mar 07 07:45:04 crc kubenswrapper[4941]: I0307 07:45:04.408337 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss"] Mar 07 07:45:04 crc kubenswrapper[4941]: I0307 07:45:04.414063 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-nlhss"] Mar 07 07:45:05 crc kubenswrapper[4941]: I0307 07:45:05.965204 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd63482d-26a4-4246-8dbc-bad3c19aff6a" path="/var/lib/kubelet/pods/fd63482d-26a4-4246-8dbc-bad3c19aff6a/volumes" Mar 07 07:45:09 crc kubenswrapper[4941]: I0307 07:45:09.037231 4941 scope.go:117] "RemoveContainer" 
containerID="08a0a52c6e27a3e21d4515d4c94a5193d3627154fb6406255e04023201b7b8c7" Mar 07 07:45:09 crc kubenswrapper[4941]: I0307 07:45:09.081165 4941 scope.go:117] "RemoveContainer" containerID="d1544d2a6907322c8e6755a159ebb752f8b1c3cbea675954e6310f3b63370818" Mar 07 07:45:17 crc kubenswrapper[4941]: I0307 07:45:17.955344 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:45:17 crc kubenswrapper[4941]: E0307 07:45:17.958944 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:45:30 crc kubenswrapper[4941]: I0307 07:45:30.955933 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:45:30 crc kubenswrapper[4941]: E0307 07:45:30.957396 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:45:44 crc kubenswrapper[4941]: I0307 07:45:44.954629 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:45:44 crc kubenswrapper[4941]: E0307 07:45:44.955388 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:45:58 crc kubenswrapper[4941]: I0307 07:45:58.954528 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:45:58 crc kubenswrapper[4941]: E0307 07:45:58.955477 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.208882 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547826-xnxbb"] Mar 07 07:46:00 crc kubenswrapper[4941]: E0307 07:46:00.209374 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebaa4910-35c1-4caf-97ba-99d85ee5ebdb" containerName="collect-profiles" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.209394 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebaa4910-35c1-4caf-97ba-99d85ee5ebdb" containerName="collect-profiles" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.209649 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebaa4910-35c1-4caf-97ba-99d85ee5ebdb" containerName="collect-profiles" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.210740 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547826-xnxbb" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.215703 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.215706 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.215861 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.219329 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547826-xnxbb"] Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.265621 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsb5j\" (UniqueName: \"kubernetes.io/projected/85849ea0-d0b0-4911-a00f-53ac4b58d95c-kube-api-access-gsb5j\") pod \"auto-csr-approver-29547826-xnxbb\" (UID: \"85849ea0-d0b0-4911-a00f-53ac4b58d95c\") " pod="openshift-infra/auto-csr-approver-29547826-xnxbb" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.366928 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsb5j\" (UniqueName: \"kubernetes.io/projected/85849ea0-d0b0-4911-a00f-53ac4b58d95c-kube-api-access-gsb5j\") pod \"auto-csr-approver-29547826-xnxbb\" (UID: \"85849ea0-d0b0-4911-a00f-53ac4b58d95c\") " pod="openshift-infra/auto-csr-approver-29547826-xnxbb" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.401165 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsb5j\" (UniqueName: \"kubernetes.io/projected/85849ea0-d0b0-4911-a00f-53ac4b58d95c-kube-api-access-gsb5j\") pod \"auto-csr-approver-29547826-xnxbb\" (UID: \"85849ea0-d0b0-4911-a00f-53ac4b58d95c\") " 
pod="openshift-infra/auto-csr-approver-29547826-xnxbb" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.550528 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547826-xnxbb" Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.861649 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547826-xnxbb"] Mar 07 07:46:00 crc kubenswrapper[4941]: W0307 07:46:00.867620 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85849ea0_d0b0_4911_a00f_53ac4b58d95c.slice/crio-30f31d9ae10bdc1d366661162440157ddecac36926b861acc60fda5e7bc469b4 WatchSource:0}: Error finding container 30f31d9ae10bdc1d366661162440157ddecac36926b861acc60fda5e7bc469b4: Status 404 returned error can't find the container with id 30f31d9ae10bdc1d366661162440157ddecac36926b861acc60fda5e7bc469b4 Mar 07 07:46:00 crc kubenswrapper[4941]: I0307 07:46:00.974937 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547826-xnxbb" event={"ID":"85849ea0-d0b0-4911-a00f-53ac4b58d95c","Type":"ContainerStarted","Data":"30f31d9ae10bdc1d366661162440157ddecac36926b861acc60fda5e7bc469b4"} Mar 07 07:46:02 crc kubenswrapper[4941]: I0307 07:46:02.993981 4941 generic.go:334] "Generic (PLEG): container finished" podID="85849ea0-d0b0-4911-a00f-53ac4b58d95c" containerID="6cb1618a65291283f355f148744de26d668facf92d8725ff47caec1d3e5c2582" exitCode=0 Mar 07 07:46:02 crc kubenswrapper[4941]: I0307 07:46:02.994629 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547826-xnxbb" event={"ID":"85849ea0-d0b0-4911-a00f-53ac4b58d95c","Type":"ContainerDied","Data":"6cb1618a65291283f355f148744de26d668facf92d8725ff47caec1d3e5c2582"} Mar 07 07:46:04 crc kubenswrapper[4941]: I0307 07:46:04.339114 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547826-xnxbb" Mar 07 07:46:04 crc kubenswrapper[4941]: I0307 07:46:04.424617 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsb5j\" (UniqueName: \"kubernetes.io/projected/85849ea0-d0b0-4911-a00f-53ac4b58d95c-kube-api-access-gsb5j\") pod \"85849ea0-d0b0-4911-a00f-53ac4b58d95c\" (UID: \"85849ea0-d0b0-4911-a00f-53ac4b58d95c\") " Mar 07 07:46:04 crc kubenswrapper[4941]: I0307 07:46:04.430521 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85849ea0-d0b0-4911-a00f-53ac4b58d95c-kube-api-access-gsb5j" (OuterVolumeSpecName: "kube-api-access-gsb5j") pod "85849ea0-d0b0-4911-a00f-53ac4b58d95c" (UID: "85849ea0-d0b0-4911-a00f-53ac4b58d95c"). InnerVolumeSpecName "kube-api-access-gsb5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:46:04 crc kubenswrapper[4941]: I0307 07:46:04.525730 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsb5j\" (UniqueName: \"kubernetes.io/projected/85849ea0-d0b0-4911-a00f-53ac4b58d95c-kube-api-access-gsb5j\") on node \"crc\" DevicePath \"\"" Mar 07 07:46:05 crc kubenswrapper[4941]: I0307 07:46:05.016391 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547826-xnxbb" event={"ID":"85849ea0-d0b0-4911-a00f-53ac4b58d95c","Type":"ContainerDied","Data":"30f31d9ae10bdc1d366661162440157ddecac36926b861acc60fda5e7bc469b4"} Mar 07 07:46:05 crc kubenswrapper[4941]: I0307 07:46:05.016487 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f31d9ae10bdc1d366661162440157ddecac36926b861acc60fda5e7bc469b4" Mar 07 07:46:05 crc kubenswrapper[4941]: I0307 07:46:05.016564 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547826-xnxbb" Mar 07 07:46:05 crc kubenswrapper[4941]: I0307 07:46:05.428110 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547820-7d9rb"] Mar 07 07:46:05 crc kubenswrapper[4941]: I0307 07:46:05.436145 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547820-7d9rb"] Mar 07 07:46:05 crc kubenswrapper[4941]: I0307 07:46:05.964360 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d37680-772d-48a1-b229-4e91565a28ad" path="/var/lib/kubelet/pods/54d37680-772d-48a1-b229-4e91565a28ad/volumes" Mar 07 07:46:09 crc kubenswrapper[4941]: I0307 07:46:09.182589 4941 scope.go:117] "RemoveContainer" containerID="4ece2f8ba8ad11a2f62165c6184fdc93539e9d30b454fcccbcb58bde8698c1ee" Mar 07 07:46:09 crc kubenswrapper[4941]: I0307 07:46:09.954346 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:46:09 crc kubenswrapper[4941]: E0307 07:46:09.954756 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:46:22 crc kubenswrapper[4941]: I0307 07:46:22.955706 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070" Mar 07 07:46:23 crc kubenswrapper[4941]: I0307 07:46:23.164944 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" 
event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"c6dba3014bf81fa7753ec565ebbd4a7a4058e13da0db9e277e4e84371306df7a"} Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.154310 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547828-899jh"] Mar 07 07:48:00 crc kubenswrapper[4941]: E0307 07:48:00.155836 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85849ea0-d0b0-4911-a00f-53ac4b58d95c" containerName="oc" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.155859 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="85849ea0-d0b0-4911-a00f-53ac4b58d95c" containerName="oc" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.156094 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="85849ea0-d0b0-4911-a00f-53ac4b58d95c" containerName="oc" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.156899 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547828-899jh" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.159354 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.159812 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.159814 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.176582 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547828-899jh"] Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.208311 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9rq\" (UniqueName: 
\"kubernetes.io/projected/a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb-kube-api-access-sc9rq\") pod \"auto-csr-approver-29547828-899jh\" (UID: \"a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb\") " pod="openshift-infra/auto-csr-approver-29547828-899jh" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.309188 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9rq\" (UniqueName: \"kubernetes.io/projected/a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb-kube-api-access-sc9rq\") pod \"auto-csr-approver-29547828-899jh\" (UID: \"a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb\") " pod="openshift-infra/auto-csr-approver-29547828-899jh" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.328294 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc9rq\" (UniqueName: \"kubernetes.io/projected/a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb-kube-api-access-sc9rq\") pod \"auto-csr-approver-29547828-899jh\" (UID: \"a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb\") " pod="openshift-infra/auto-csr-approver-29547828-899jh" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.488796 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547828-899jh" Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.951108 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547828-899jh"] Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.952811 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:48:00 crc kubenswrapper[4941]: I0307 07:48:00.980190 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547828-899jh" event={"ID":"a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb","Type":"ContainerStarted","Data":"90a3e5bf9afb05d0526ec60f84324915acfadda6e1030ebf68c4ea8f64616532"} Mar 07 07:48:03 crc kubenswrapper[4941]: I0307 07:48:03.008708 4941 generic.go:334] "Generic (PLEG): container finished" podID="a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb" containerID="c912267d31265560e3a721d06f09b02eac33acc64eda213e9bac2a6ceceda0ef" exitCode=0 Mar 07 07:48:03 crc kubenswrapper[4941]: I0307 07:48:03.008827 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547828-899jh" event={"ID":"a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb","Type":"ContainerDied","Data":"c912267d31265560e3a721d06f09b02eac33acc64eda213e9bac2a6ceceda0ef"} Mar 07 07:48:04 crc kubenswrapper[4941]: I0307 07:48:04.343110 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547828-899jh" Mar 07 07:48:04 crc kubenswrapper[4941]: I0307 07:48:04.470162 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc9rq\" (UniqueName: \"kubernetes.io/projected/a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb-kube-api-access-sc9rq\") pod \"a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb\" (UID: \"a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb\") " Mar 07 07:48:04 crc kubenswrapper[4941]: I0307 07:48:04.478681 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb-kube-api-access-sc9rq" (OuterVolumeSpecName: "kube-api-access-sc9rq") pod "a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb" (UID: "a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb"). InnerVolumeSpecName "kube-api-access-sc9rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:48:04 crc kubenswrapper[4941]: I0307 07:48:04.572107 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc9rq\" (UniqueName: \"kubernetes.io/projected/a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb-kube-api-access-sc9rq\") on node \"crc\" DevicePath \"\"" Mar 07 07:48:05 crc kubenswrapper[4941]: I0307 07:48:05.029720 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547828-899jh" event={"ID":"a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb","Type":"ContainerDied","Data":"90a3e5bf9afb05d0526ec60f84324915acfadda6e1030ebf68c4ea8f64616532"} Mar 07 07:48:05 crc kubenswrapper[4941]: I0307 07:48:05.029986 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90a3e5bf9afb05d0526ec60f84324915acfadda6e1030ebf68c4ea8f64616532" Mar 07 07:48:05 crc kubenswrapper[4941]: I0307 07:48:05.029756 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547828-899jh" Mar 07 07:48:05 crc kubenswrapper[4941]: I0307 07:48:05.453769 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547822-mvrdm"] Mar 07 07:48:05 crc kubenswrapper[4941]: I0307 07:48:05.459896 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547822-mvrdm"] Mar 07 07:48:05 crc kubenswrapper[4941]: I0307 07:48:05.963193 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50992226-d92b-41d6-b68a-71a56abb9793" path="/var/lib/kubelet/pods/50992226-d92b-41d6-b68a-71a56abb9793/volumes" Mar 07 07:48:09 crc kubenswrapper[4941]: I0307 07:48:09.265904 4941 scope.go:117] "RemoveContainer" containerID="f2fa7da72412c1dc52e3116e58ecb16b11a0ef5baaade900c35732dfcb2dac68" Mar 07 07:48:40 crc kubenswrapper[4941]: I0307 07:48:40.313728 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:48:40 crc kubenswrapper[4941]: I0307 07:48:40.314271 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.428844 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z6t4x"] Mar 07 07:49:05 crc kubenswrapper[4941]: E0307 07:49:05.429576 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb" containerName="oc" Mar 07 07:49:05 crc 
kubenswrapper[4941]: I0307 07:49:05.429588 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb" containerName="oc" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.429710 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb" containerName="oc" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.430671 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.445194 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z6t4x"] Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.454372 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/0f94361e-b41d-4b10-ae8c-d85aa833faaa-kube-api-access-k7vdz\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.454494 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-catalog-content\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.454549 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-utilities\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc 
kubenswrapper[4941]: I0307 07:49:05.555725 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/0f94361e-b41d-4b10-ae8c-d85aa833faaa-kube-api-access-k7vdz\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.556112 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-catalog-content\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.556252 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-utilities\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.556617 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-catalog-content\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.556628 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-utilities\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.576139 
4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/0f94361e-b41d-4b10-ae8c-d85aa833faaa-kube-api-access-k7vdz\") pod \"certified-operators-z6t4x\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:05 crc kubenswrapper[4941]: I0307 07:49:05.757418 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:06 crc kubenswrapper[4941]: I0307 07:49:06.253930 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z6t4x"] Mar 07 07:49:06 crc kubenswrapper[4941]: I0307 07:49:06.806014 4941 generic.go:334] "Generic (PLEG): container finished" podID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerID="fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51" exitCode=0 Mar 07 07:49:06 crc kubenswrapper[4941]: I0307 07:49:06.806290 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6t4x" event={"ID":"0f94361e-b41d-4b10-ae8c-d85aa833faaa","Type":"ContainerDied","Data":"fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51"} Mar 07 07:49:06 crc kubenswrapper[4941]: I0307 07:49:06.806321 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6t4x" event={"ID":"0f94361e-b41d-4b10-ae8c-d85aa833faaa","Type":"ContainerStarted","Data":"10bd4513f803e2f95e7e0b8eccb681de93d9aadc7b50baf68e243df04f552a53"} Mar 07 07:49:06 crc kubenswrapper[4941]: I0307 07:49:06.855309 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2jrfp"] Mar 07 07:49:06 crc kubenswrapper[4941]: I0307 07:49:06.868123 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:06 crc kubenswrapper[4941]: I0307 07:49:06.868956 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2jrfp"] Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.012855 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-catalog-content\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.012929 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-utilities\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.013004 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmw7k\" (UniqueName: \"kubernetes.io/projected/1800a89b-3818-464c-a992-ca10ae3f6a43-kube-api-access-lmw7k\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.114656 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmw7k\" (UniqueName: \"kubernetes.io/projected/1800a89b-3818-464c-a992-ca10ae3f6a43-kube-api-access-lmw7k\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.114804 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-catalog-content\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.114831 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-utilities\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.116330 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-catalog-content\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.116655 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-utilities\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.136076 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmw7k\" (UniqueName: \"kubernetes.io/projected/1800a89b-3818-464c-a992-ca10ae3f6a43-kube-api-access-lmw7k\") pod \"community-operators-2jrfp\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.331565 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.801023 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2jrfp"] Mar 07 07:49:07 crc kubenswrapper[4941]: I0307 07:49:07.824981 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6t4x" event={"ID":"0f94361e-b41d-4b10-ae8c-d85aa833faaa","Type":"ContainerStarted","Data":"94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7"} Mar 07 07:49:08 crc kubenswrapper[4941]: I0307 07:49:08.837901 4941 generic.go:334] "Generic (PLEG): container finished" podID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerID="723a039ddd9cd8779e38e8ab6259fe01eaf89bd81af0a691406fc124b9c64027" exitCode=0 Mar 07 07:49:08 crc kubenswrapper[4941]: I0307 07:49:08.837982 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jrfp" event={"ID":"1800a89b-3818-464c-a992-ca10ae3f6a43","Type":"ContainerDied","Data":"723a039ddd9cd8779e38e8ab6259fe01eaf89bd81af0a691406fc124b9c64027"} Mar 07 07:49:08 crc kubenswrapper[4941]: I0307 07:49:08.838335 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jrfp" event={"ID":"1800a89b-3818-464c-a992-ca10ae3f6a43","Type":"ContainerStarted","Data":"f527c3ab15fd1f9e44036b7e5c9b1e7ef537c0679087fd3755ef2bc1285accc6"} Mar 07 07:49:08 crc kubenswrapper[4941]: I0307 07:49:08.842281 4941 generic.go:334] "Generic (PLEG): container finished" podID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerID="94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7" exitCode=0 Mar 07 07:49:08 crc kubenswrapper[4941]: I0307 07:49:08.842382 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6t4x" 
event={"ID":"0f94361e-b41d-4b10-ae8c-d85aa833faaa","Type":"ContainerDied","Data":"94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7"} Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.633454 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vjtdg"] Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.635025 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.661574 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjtdg"] Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.748545 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-utilities\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.748620 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-catalog-content\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.748667 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84txq\" (UniqueName: \"kubernetes.io/projected/c5e29523-87f6-4424-a7da-0b2e62ec857f-kube-api-access-84txq\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.849927 4941 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-catalog-content\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.850385 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84txq\" (UniqueName: \"kubernetes.io/projected/c5e29523-87f6-4424-a7da-0b2e62ec857f-kube-api-access-84txq\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.850493 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-utilities\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.850528 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-catalog-content\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.850740 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-utilities\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.853609 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-z6t4x" event={"ID":"0f94361e-b41d-4b10-ae8c-d85aa833faaa","Type":"ContainerStarted","Data":"d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2"} Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.856792 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jrfp" event={"ID":"1800a89b-3818-464c-a992-ca10ae3f6a43","Type":"ContainerStarted","Data":"c18bfc3a96e0b47aca621b94462367821296623ded4cf5d55409b0109f361f84"} Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.878945 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z6t4x" podStartSLOduration=2.414610563 podStartE2EDuration="4.878921431s" podCreationTimestamp="2026-03-07 07:49:05 +0000 UTC" firstStartedPulling="2026-03-07 07:49:06.808035797 +0000 UTC m=+3443.760401252" lastFinishedPulling="2026-03-07 07:49:09.272346615 +0000 UTC m=+3446.224712120" observedRunningTime="2026-03-07 07:49:09.876997753 +0000 UTC m=+3446.829363218" watchObservedRunningTime="2026-03-07 07:49:09.878921431 +0000 UTC m=+3446.831286896" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.887345 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84txq\" (UniqueName: \"kubernetes.io/projected/c5e29523-87f6-4424-a7da-0b2e62ec857f-kube-api-access-84txq\") pod \"redhat-marketplace-vjtdg\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:09 crc kubenswrapper[4941]: I0307 07:49:09.960560 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.313637 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.314066 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.389291 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjtdg"] Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.632128 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s8jzs"] Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.635877 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.640832 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8jzs"] Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.664696 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-utilities\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.664742 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dg2g\" (UniqueName: \"kubernetes.io/projected/7b319784-3b37-4ffa-b81b-5db086387b49-kube-api-access-6dg2g\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.664849 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-catalog-content\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.766208 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-catalog-content\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.766369 4941 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-utilities\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.766413 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dg2g\" (UniqueName: \"kubernetes.io/projected/7b319784-3b37-4ffa-b81b-5db086387b49-kube-api-access-6dg2g\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.766875 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-catalog-content\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.767019 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-utilities\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.791206 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dg2g\" (UniqueName: \"kubernetes.io/projected/7b319784-3b37-4ffa-b81b-5db086387b49-kube-api-access-6dg2g\") pod \"redhat-operators-s8jzs\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") " pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.865922 4941 generic.go:334] "Generic (PLEG): container finished" podID="c5e29523-87f6-4424-a7da-0b2e62ec857f" 
containerID="5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3" exitCode=0 Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.866028 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjtdg" event={"ID":"c5e29523-87f6-4424-a7da-0b2e62ec857f","Type":"ContainerDied","Data":"5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3"} Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.866322 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjtdg" event={"ID":"c5e29523-87f6-4424-a7da-0b2e62ec857f","Type":"ContainerStarted","Data":"8e74c083782f197962b3fa83487dbe3dd4b1e8ae70520be7d438d7ee9008bef4"} Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.868682 4941 generic.go:334] "Generic (PLEG): container finished" podID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerID="c18bfc3a96e0b47aca621b94462367821296623ded4cf5d55409b0109f361f84" exitCode=0 Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.868770 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jrfp" event={"ID":"1800a89b-3818-464c-a992-ca10ae3f6a43","Type":"ContainerDied","Data":"c18bfc3a96e0b47aca621b94462367821296623ded4cf5d55409b0109f361f84"} Mar 07 07:49:10 crc kubenswrapper[4941]: I0307 07:49:10.975353 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:11 crc kubenswrapper[4941]: I0307 07:49:11.884936 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jrfp" event={"ID":"1800a89b-3818-464c-a992-ca10ae3f6a43","Type":"ContainerStarted","Data":"2e4308794c1d3e73bc45dc7aa2036c40aaade667cfff6b76a0e5783962663890"} Mar 07 07:49:11 crc kubenswrapper[4941]: I0307 07:49:11.891674 4941 generic.go:334] "Generic (PLEG): container finished" podID="c5e29523-87f6-4424-a7da-0b2e62ec857f" containerID="9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec" exitCode=0 Mar 07 07:49:11 crc kubenswrapper[4941]: I0307 07:49:11.891729 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjtdg" event={"ID":"c5e29523-87f6-4424-a7da-0b2e62ec857f","Type":"ContainerDied","Data":"9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec"} Mar 07 07:49:11 crc kubenswrapper[4941]: I0307 07:49:11.907305 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2jrfp" podStartSLOduration=3.262516131 podStartE2EDuration="5.907288012s" podCreationTimestamp="2026-03-07 07:49:06 +0000 UTC" firstStartedPulling="2026-03-07 07:49:08.841562406 +0000 UTC m=+3445.793927891" lastFinishedPulling="2026-03-07 07:49:11.486334297 +0000 UTC m=+3448.438699772" observedRunningTime="2026-03-07 07:49:11.902976545 +0000 UTC m=+3448.855342010" watchObservedRunningTime="2026-03-07 07:49:11.907288012 +0000 UTC m=+3448.859653497" Mar 07 07:49:12 crc kubenswrapper[4941]: I0307 07:49:12.012628 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8jzs"] Mar 07 07:49:12 crc kubenswrapper[4941]: I0307 07:49:12.903078 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjtdg" 
event={"ID":"c5e29523-87f6-4424-a7da-0b2e62ec857f","Type":"ContainerStarted","Data":"61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7"} Mar 07 07:49:12 crc kubenswrapper[4941]: I0307 07:49:12.905347 4941 generic.go:334] "Generic (PLEG): container finished" podID="7b319784-3b37-4ffa-b81b-5db086387b49" containerID="f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a" exitCode=0 Mar 07 07:49:12 crc kubenswrapper[4941]: I0307 07:49:12.905580 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8jzs" event={"ID":"7b319784-3b37-4ffa-b81b-5db086387b49","Type":"ContainerDied","Data":"f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a"} Mar 07 07:49:12 crc kubenswrapper[4941]: I0307 07:49:12.905714 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8jzs" event={"ID":"7b319784-3b37-4ffa-b81b-5db086387b49","Type":"ContainerStarted","Data":"61d8d86418f68ccdf7f2191c97f983138e8fef6a800da7f0e8de93e8da39d63b"} Mar 07 07:49:12 crc kubenswrapper[4941]: I0307 07:49:12.957237 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vjtdg" podStartSLOduration=2.542852528 podStartE2EDuration="3.957213609s" podCreationTimestamp="2026-03-07 07:49:09 +0000 UTC" firstStartedPulling="2026-03-07 07:49:10.868498151 +0000 UTC m=+3447.820863616" lastFinishedPulling="2026-03-07 07:49:12.282859232 +0000 UTC m=+3449.235224697" observedRunningTime="2026-03-07 07:49:12.930616959 +0000 UTC m=+3449.882982444" watchObservedRunningTime="2026-03-07 07:49:12.957213609 +0000 UTC m=+3449.909579104" Mar 07 07:49:13 crc kubenswrapper[4941]: I0307 07:49:13.913839 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8jzs" 
event={"ID":"7b319784-3b37-4ffa-b81b-5db086387b49","Type":"ContainerStarted","Data":"c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24"} Mar 07 07:49:14 crc kubenswrapper[4941]: I0307 07:49:14.926616 4941 generic.go:334] "Generic (PLEG): container finished" podID="7b319784-3b37-4ffa-b81b-5db086387b49" containerID="c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24" exitCode=0 Mar 07 07:49:14 crc kubenswrapper[4941]: I0307 07:49:14.926754 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8jzs" event={"ID":"7b319784-3b37-4ffa-b81b-5db086387b49","Type":"ContainerDied","Data":"c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24"} Mar 07 07:49:15 crc kubenswrapper[4941]: I0307 07:49:15.757846 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:15 crc kubenswrapper[4941]: I0307 07:49:15.757915 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:15 crc kubenswrapper[4941]: I0307 07:49:15.804499 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:15 crc kubenswrapper[4941]: I0307 07:49:15.970838 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:16 crc kubenswrapper[4941]: I0307 07:49:16.946008 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8jzs" event={"ID":"7b319784-3b37-4ffa-b81b-5db086387b49","Type":"ContainerStarted","Data":"a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c"} Mar 07 07:49:16 crc kubenswrapper[4941]: I0307 07:49:16.965336 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s8jzs" 
podStartSLOduration=3.841574342 podStartE2EDuration="6.965320096s" podCreationTimestamp="2026-03-07 07:49:10 +0000 UTC" firstStartedPulling="2026-03-07 07:49:12.907364913 +0000 UTC m=+3449.859730418" lastFinishedPulling="2026-03-07 07:49:16.031110717 +0000 UTC m=+3452.983476172" observedRunningTime="2026-03-07 07:49:16.961377178 +0000 UTC m=+3453.913742633" watchObservedRunningTime="2026-03-07 07:49:16.965320096 +0000 UTC m=+3453.917685561" Mar 07 07:49:17 crc kubenswrapper[4941]: I0307 07:49:17.332600 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:17 crc kubenswrapper[4941]: I0307 07:49:17.332697 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:17 crc kubenswrapper[4941]: I0307 07:49:17.382163 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:17 crc kubenswrapper[4941]: I0307 07:49:17.649481 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z6t4x"] Mar 07 07:49:17 crc kubenswrapper[4941]: I0307 07:49:17.953123 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z6t4x" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerName="registry-server" containerID="cri-o://d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2" gracePeriod=2 Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.004205 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.360190 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.485602 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/0f94361e-b41d-4b10-ae8c-d85aa833faaa-kube-api-access-k7vdz\") pod \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.485761 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-utilities\") pod \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.485787 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-catalog-content\") pod \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\" (UID: \"0f94361e-b41d-4b10-ae8c-d85aa833faaa\") " Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.487843 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-utilities" (OuterVolumeSpecName: "utilities") pod "0f94361e-b41d-4b10-ae8c-d85aa833faaa" (UID: "0f94361e-b41d-4b10-ae8c-d85aa833faaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.491660 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f94361e-b41d-4b10-ae8c-d85aa833faaa-kube-api-access-k7vdz" (OuterVolumeSpecName: "kube-api-access-k7vdz") pod "0f94361e-b41d-4b10-ae8c-d85aa833faaa" (UID: "0f94361e-b41d-4b10-ae8c-d85aa833faaa"). InnerVolumeSpecName "kube-api-access-k7vdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.587737 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.587773 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/0f94361e-b41d-4b10-ae8c-d85aa833faaa-kube-api-access-k7vdz\") on node \"crc\" DevicePath \"\"" Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.966067 4941 generic.go:334] "Generic (PLEG): container finished" podID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerID="d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2" exitCode=0 Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.966178 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6t4x" event={"ID":"0f94361e-b41d-4b10-ae8c-d85aa833faaa","Type":"ContainerDied","Data":"d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2"} Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.966191 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z6t4x" Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.966290 4941 scope.go:117] "RemoveContainer" containerID="d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2" Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.966276 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6t4x" event={"ID":"0f94361e-b41d-4b10-ae8c-d85aa833faaa","Type":"ContainerDied","Data":"10bd4513f803e2f95e7e0b8eccb681de93d9aadc7b50baf68e243df04f552a53"} Mar 07 07:49:18 crc kubenswrapper[4941]: I0307 07:49:18.994500 4941 scope.go:117] "RemoveContainer" containerID="94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.026585 4941 scope.go:117] "RemoveContainer" containerID="fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.062356 4941 scope.go:117] "RemoveContainer" containerID="d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2" Mar 07 07:49:19 crc kubenswrapper[4941]: E0307 07:49:19.062858 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2\": container with ID starting with d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2 not found: ID does not exist" containerID="d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.062905 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2"} err="failed to get container status \"d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2\": rpc error: code = NotFound desc = could not find container 
\"d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2\": container with ID starting with d604ad7dde3dcdc7ff3808de49fdac469a23376d4960041051e045c21b5020b2 not found: ID does not exist" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.062937 4941 scope.go:117] "RemoveContainer" containerID="94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7" Mar 07 07:49:19 crc kubenswrapper[4941]: E0307 07:49:19.063843 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7\": container with ID starting with 94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7 not found: ID does not exist" containerID="94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.063882 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7"} err="failed to get container status \"94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7\": rpc error: code = NotFound desc = could not find container \"94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7\": container with ID starting with 94b983e366ddb474cbb0f9ae6051a4f4d8cbdfff4630e77b8ce2b8345ad7a7d7 not found: ID does not exist" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.063935 4941 scope.go:117] "RemoveContainer" containerID="fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51" Mar 07 07:49:19 crc kubenswrapper[4941]: E0307 07:49:19.064277 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51\": container with ID starting with fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51 not found: ID does not exist" 
containerID="fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.064302 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51"} err="failed to get container status \"fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51\": rpc error: code = NotFound desc = could not find container \"fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51\": container with ID starting with fcca9beb073cd77b31da16825f7eab4c2a658dd0d40ccc3784bd35009ec48d51 not found: ID does not exist" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.218587 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f94361e-b41d-4b10-ae8c-d85aa833faaa" (UID: "0f94361e-b41d-4b10-ae8c-d85aa833faaa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.297095 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z6t4x"] Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.297911 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94361e-b41d-4b10-ae8c-d85aa833faaa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.303037 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z6t4x"] Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.967732 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" path="/var/lib/kubelet/pods/0f94361e-b41d-4b10-ae8c-d85aa833faaa/volumes" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.968768 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:19 crc kubenswrapper[4941]: I0307 07:49:19.968817 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:20 crc kubenswrapper[4941]: I0307 07:49:20.014681 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:20 crc kubenswrapper[4941]: I0307 07:49:20.976046 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:20 crc kubenswrapper[4941]: I0307 07:49:20.976428 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s8jzs" Mar 07 07:49:21 crc kubenswrapper[4941]: I0307 07:49:21.045645 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:22 crc kubenswrapper[4941]: I0307 07:49:22.033338 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s8jzs" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="registry-server" probeResult="failure" output=< Mar 07 07:49:22 crc kubenswrapper[4941]: timeout: failed to connect service ":50051" within 1s Mar 07 07:49:22 crc kubenswrapper[4941]: > Mar 07 07:49:22 crc kubenswrapper[4941]: I0307 07:49:22.216742 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2jrfp"] Mar 07 07:49:22 crc kubenswrapper[4941]: I0307 07:49:22.216998 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2jrfp" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerName="registry-server" containerID="cri-o://2e4308794c1d3e73bc45dc7aa2036c40aaade667cfff6b76a0e5783962663890" gracePeriod=2 Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.024347 4941 generic.go:334] "Generic (PLEG): container finished" podID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerID="2e4308794c1d3e73bc45dc7aa2036c40aaade667cfff6b76a0e5783962663890" exitCode=0 Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.024471 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jrfp" event={"ID":"1800a89b-3818-464c-a992-ca10ae3f6a43","Type":"ContainerDied","Data":"2e4308794c1d3e73bc45dc7aa2036c40aaade667cfff6b76a0e5783962663890"} Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.030997 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjtdg"] Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.031492 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vjtdg" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" 
containerName="registry-server" containerID="cri-o://61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7" gracePeriod=2 Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.462876 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.581771 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-catalog-content\") pod \"c5e29523-87f6-4424-a7da-0b2e62ec857f\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.581830 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84txq\" (UniqueName: \"kubernetes.io/projected/c5e29523-87f6-4424-a7da-0b2e62ec857f-kube-api-access-84txq\") pod \"c5e29523-87f6-4424-a7da-0b2e62ec857f\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.581922 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-utilities\") pod \"c5e29523-87f6-4424-a7da-0b2e62ec857f\" (UID: \"c5e29523-87f6-4424-a7da-0b2e62ec857f\") " Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.584531 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-utilities" (OuterVolumeSpecName: "utilities") pod "c5e29523-87f6-4424-a7da-0b2e62ec857f" (UID: "c5e29523-87f6-4424-a7da-0b2e62ec857f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.587766 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e29523-87f6-4424-a7da-0b2e62ec857f-kube-api-access-84txq" (OuterVolumeSpecName: "kube-api-access-84txq") pod "c5e29523-87f6-4424-a7da-0b2e62ec857f" (UID: "c5e29523-87f6-4424-a7da-0b2e62ec857f"). InnerVolumeSpecName "kube-api-access-84txq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.615346 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5e29523-87f6-4424-a7da-0b2e62ec857f" (UID: "c5e29523-87f6-4424-a7da-0b2e62ec857f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.684425 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.684480 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e29523-87f6-4424-a7da-0b2e62ec857f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:49:24 crc kubenswrapper[4941]: I0307 07:49:24.684495 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84txq\" (UniqueName: \"kubernetes.io/projected/c5e29523-87f6-4424-a7da-0b2e62ec857f-kube-api-access-84txq\") on node \"crc\" DevicePath \"\"" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.037385 4941 generic.go:334] "Generic (PLEG): container finished" podID="c5e29523-87f6-4424-a7da-0b2e62ec857f" 
containerID="61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7" exitCode=0 Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.037459 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjtdg" event={"ID":"c5e29523-87f6-4424-a7da-0b2e62ec857f","Type":"ContainerDied","Data":"61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7"} Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.037504 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vjtdg" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.037533 4941 scope.go:117] "RemoveContainer" containerID="61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.037519 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjtdg" event={"ID":"c5e29523-87f6-4424-a7da-0b2e62ec857f","Type":"ContainerDied","Data":"8e74c083782f197962b3fa83487dbe3dd4b1e8ae70520be7d438d7ee9008bef4"} Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.068109 4941 scope.go:117] "RemoveContainer" containerID="9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.092144 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjtdg"] Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.104653 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjtdg"] Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.118930 4941 scope.go:117] "RemoveContainer" containerID="5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.142912 4941 scope.go:117] "RemoveContainer" containerID="61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7" Mar 07 
07:49:25 crc kubenswrapper[4941]: E0307 07:49:25.143455 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7\": container with ID starting with 61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7 not found: ID does not exist" containerID="61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.143512 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7"} err="failed to get container status \"61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7\": rpc error: code = NotFound desc = could not find container \"61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7\": container with ID starting with 61f9089709db6a3ebf1811bebdbe8cba513bf00fc02399f0daef05caaf407be7 not found: ID does not exist" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.143551 4941 scope.go:117] "RemoveContainer" containerID="9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec" Mar 07 07:49:25 crc kubenswrapper[4941]: E0307 07:49:25.144005 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec\": container with ID starting with 9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec not found: ID does not exist" containerID="9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.144057 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec"} err="failed to get container status 
\"9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec\": rpc error: code = NotFound desc = could not find container \"9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec\": container with ID starting with 9d54afbcefc23e1a1df4649f4a5017e8385e62dd74589e4a5d50e2c58aeb07ec not found: ID does not exist" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.144092 4941 scope.go:117] "RemoveContainer" containerID="5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3" Mar 07 07:49:25 crc kubenswrapper[4941]: E0307 07:49:25.144475 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3\": container with ID starting with 5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3 not found: ID does not exist" containerID="5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.144526 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3"} err="failed to get container status \"5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3\": rpc error: code = NotFound desc = could not find container \"5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3\": container with ID starting with 5c26e97a4474f6459bf9966586255c34e25d41d01ccd64f05a96c68555b6c7e3 not found: ID does not exist" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.449523 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2jrfp" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.599028 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmw7k\" (UniqueName: \"kubernetes.io/projected/1800a89b-3818-464c-a992-ca10ae3f6a43-kube-api-access-lmw7k\") pod \"1800a89b-3818-464c-a992-ca10ae3f6a43\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.599354 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-utilities\") pod \"1800a89b-3818-464c-a992-ca10ae3f6a43\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.599540 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-catalog-content\") pod \"1800a89b-3818-464c-a992-ca10ae3f6a43\" (UID: \"1800a89b-3818-464c-a992-ca10ae3f6a43\") " Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.600060 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-utilities" (OuterVolumeSpecName: "utilities") pod "1800a89b-3818-464c-a992-ca10ae3f6a43" (UID: "1800a89b-3818-464c-a992-ca10ae3f6a43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.603003 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1800a89b-3818-464c-a992-ca10ae3f6a43-kube-api-access-lmw7k" (OuterVolumeSpecName: "kube-api-access-lmw7k") pod "1800a89b-3818-464c-a992-ca10ae3f6a43" (UID: "1800a89b-3818-464c-a992-ca10ae3f6a43"). InnerVolumeSpecName "kube-api-access-lmw7k". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.663763 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1800a89b-3818-464c-a992-ca10ae3f6a43" (UID: "1800a89b-3818-464c-a992-ca10ae3f6a43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.701532 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmw7k\" (UniqueName: \"kubernetes.io/projected/1800a89b-3818-464c-a992-ca10ae3f6a43-kube-api-access-lmw7k\") on node \"crc\" DevicePath \"\""
Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.701559 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-utilities\") on node \"crc\" DevicePath \"\""
Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.701570 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1800a89b-3818-464c-a992-ca10ae3f6a43-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 07 07:49:25 crc kubenswrapper[4941]: I0307 07:49:25.962854 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" path="/var/lib/kubelet/pods/c5e29523-87f6-4424-a7da-0b2e62ec857f/volumes"
Mar 07 07:49:26 crc kubenswrapper[4941]: I0307 07:49:26.049978 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jrfp" event={"ID":"1800a89b-3818-464c-a992-ca10ae3f6a43","Type":"ContainerDied","Data":"f527c3ab15fd1f9e44036b7e5c9b1e7ef537c0679087fd3755ef2bc1285accc6"}
Mar 07 07:49:26 crc kubenswrapper[4941]: I0307 07:49:26.050037 4941 scope.go:117] "RemoveContainer" containerID="2e4308794c1d3e73bc45dc7aa2036c40aaade667cfff6b76a0e5783962663890"
Mar 07 07:49:26 crc kubenswrapper[4941]: I0307 07:49:26.050085 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2jrfp"
Mar 07 07:49:26 crc kubenswrapper[4941]: I0307 07:49:26.074938 4941 scope.go:117] "RemoveContainer" containerID="c18bfc3a96e0b47aca621b94462367821296623ded4cf5d55409b0109f361f84"
Mar 07 07:49:26 crc kubenswrapper[4941]: I0307 07:49:26.085883 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2jrfp"]
Mar 07 07:49:26 crc kubenswrapper[4941]: I0307 07:49:26.096688 4941 scope.go:117] "RemoveContainer" containerID="723a039ddd9cd8779e38e8ab6259fe01eaf89bd81af0a691406fc124b9c64027"
Mar 07 07:49:26 crc kubenswrapper[4941]: I0307 07:49:26.097724 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2jrfp"]
Mar 07 07:49:27 crc kubenswrapper[4941]: I0307 07:49:27.966333 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" path="/var/lib/kubelet/pods/1800a89b-3818-464c-a992-ca10ae3f6a43/volumes"
Mar 07 07:49:31 crc kubenswrapper[4941]: I0307 07:49:31.024154 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s8jzs"
Mar 07 07:49:31 crc kubenswrapper[4941]: I0307 07:49:31.065675 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s8jzs"
Mar 07 07:49:31 crc kubenswrapper[4941]: I0307 07:49:31.257017 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8jzs"]
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.118047 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s8jzs" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="registry-server" containerID="cri-o://a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c" gracePeriod=2
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.613714 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8jzs"
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.741323 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dg2g\" (UniqueName: \"kubernetes.io/projected/7b319784-3b37-4ffa-b81b-5db086387b49-kube-api-access-6dg2g\") pod \"7b319784-3b37-4ffa-b81b-5db086387b49\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") "
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.741481 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-catalog-content\") pod \"7b319784-3b37-4ffa-b81b-5db086387b49\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") "
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.741544 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-utilities\") pod \"7b319784-3b37-4ffa-b81b-5db086387b49\" (UID: \"7b319784-3b37-4ffa-b81b-5db086387b49\") "
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.742546 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-utilities" (OuterVolumeSpecName: "utilities") pod "7b319784-3b37-4ffa-b81b-5db086387b49" (UID: "7b319784-3b37-4ffa-b81b-5db086387b49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.746990 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b319784-3b37-4ffa-b81b-5db086387b49-kube-api-access-6dg2g" (OuterVolumeSpecName: "kube-api-access-6dg2g") pod "7b319784-3b37-4ffa-b81b-5db086387b49" (UID: "7b319784-3b37-4ffa-b81b-5db086387b49"). InnerVolumeSpecName "kube-api-access-6dg2g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.853143 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-utilities\") on node \"crc\" DevicePath \"\""
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.853178 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dg2g\" (UniqueName: \"kubernetes.io/projected/7b319784-3b37-4ffa-b81b-5db086387b49-kube-api-access-6dg2g\") on node \"crc\" DevicePath \"\""
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.917580 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b319784-3b37-4ffa-b81b-5db086387b49" (UID: "7b319784-3b37-4ffa-b81b-5db086387b49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:49:32 crc kubenswrapper[4941]: I0307 07:49:32.954772 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b319784-3b37-4ffa-b81b-5db086387b49-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.126267 4941 generic.go:334] "Generic (PLEG): container finished" podID="7b319784-3b37-4ffa-b81b-5db086387b49" containerID="a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c" exitCode=0
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.126311 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8jzs"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.126318 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8jzs" event={"ID":"7b319784-3b37-4ffa-b81b-5db086387b49","Type":"ContainerDied","Data":"a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c"}
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.126356 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8jzs" event={"ID":"7b319784-3b37-4ffa-b81b-5db086387b49","Type":"ContainerDied","Data":"61d8d86418f68ccdf7f2191c97f983138e8fef6a800da7f0e8de93e8da39d63b"}
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.126377 4941 scope.go:117] "RemoveContainer" containerID="a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.164800 4941 scope.go:117] "RemoveContainer" containerID="c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.166949 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8jzs"]
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.187665 4941 scope.go:117] "RemoveContainer" containerID="f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.188130 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s8jzs"]
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.220069 4941 scope.go:117] "RemoveContainer" containerID="a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c"
Mar 07 07:49:33 crc kubenswrapper[4941]: E0307 07:49:33.220580 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c\": container with ID starting with a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c not found: ID does not exist" containerID="a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.220634 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c"} err="failed to get container status \"a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c\": rpc error: code = NotFound desc = could not find container \"a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c\": container with ID starting with a7b7e73f48c4534af2020d0c2132af8624f3fb57fd37d9090ec77cb93c70e35c not found: ID does not exist"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.220663 4941 scope.go:117] "RemoveContainer" containerID="c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24"
Mar 07 07:49:33 crc kubenswrapper[4941]: E0307 07:49:33.221064 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24\": container with ID starting with c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24 not found: ID does not exist" containerID="c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.221094 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24"} err="failed to get container status \"c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24\": rpc error: code = NotFound desc = could not find container \"c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24\": container with ID starting with c279f290a68976b90bda9f27ae18de373259ab86ba7fd00db47d788a4bd9fb24 not found: ID does not exist"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.221118 4941 scope.go:117] "RemoveContainer" containerID="f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a"
Mar 07 07:49:33 crc kubenswrapper[4941]: E0307 07:49:33.221489 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a\": container with ID starting with f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a not found: ID does not exist" containerID="f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.221534 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a"} err="failed to get container status \"f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a\": rpc error: code = NotFound desc = could not find container \"f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a\": container with ID starting with f102fde7f0e466e853405767cc63bd2701538cab4df16c0265bcf197e9f56d3a not found: ID does not exist"
Mar 07 07:49:33 crc kubenswrapper[4941]: I0307 07:49:33.964426 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" path="/var/lib/kubelet/pods/7b319784-3b37-4ffa-b81b-5db086387b49/volumes"
Mar 07 07:49:40 crc kubenswrapper[4941]: I0307 07:49:40.314009 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:49:40 crc kubenswrapper[4941]: I0307 07:49:40.314538 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:49:40 crc kubenswrapper[4941]: I0307 07:49:40.314588 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz"
Mar 07 07:49:40 crc kubenswrapper[4941]: I0307 07:49:40.315220 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6dba3014bf81fa7753ec565ebbd4a7a4058e13da0db9e277e4e84371306df7a"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 07 07:49:40 crc kubenswrapper[4941]: I0307 07:49:40.315264 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://c6dba3014bf81fa7753ec565ebbd4a7a4058e13da0db9e277e4e84371306df7a" gracePeriod=600
Mar 07 07:49:41 crc kubenswrapper[4941]: I0307 07:49:41.209224 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="c6dba3014bf81fa7753ec565ebbd4a7a4058e13da0db9e277e4e84371306df7a" exitCode=0
Mar 07 07:49:41 crc kubenswrapper[4941]: I0307 07:49:41.209295 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"c6dba3014bf81fa7753ec565ebbd4a7a4058e13da0db9e277e4e84371306df7a"}
Mar 07 07:49:41 crc kubenswrapper[4941]: I0307 07:49:41.209993 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92"}
Mar 07 07:49:41 crc kubenswrapper[4941]: I0307 07:49:41.210029 4941 scope.go:117] "RemoveContainer" containerID="d939e8e06f27d8e9252f8451a7b73aae288a8d6132f6c2ef238367e3a17ab070"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.167628 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547830-w4xph"]
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168667 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="extract-content"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.168695 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="extract-content"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168724 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerName="extract-utilities"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.168740 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerName="extract-utilities"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168771 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.168788 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168818 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerName="extract-content"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.168832 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerName="extract-content"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168857 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerName="extract-utilities"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.168871 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerName="extract-utilities"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168894 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" containerName="extract-content"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.168909 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" containerName="extract-content"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168925 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerName="extract-content"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.168940 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerName="extract-content"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168962 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.168977 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.168993 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="extract-utilities"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.169007 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="extract-utilities"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.169037 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" containerName="extract-utilities"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.169052 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" containerName="extract-utilities"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.169078 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.169097 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: E0307 07:50:00.169125 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.169216 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.169753 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1800a89b-3818-464c-a992-ca10ae3f6a43" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.169789 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e29523-87f6-4424-a7da-0b2e62ec857f" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.169987 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b319784-3b37-4ffa-b81b-5db086387b49" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.170048 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f94361e-b41d-4b10-ae8c-d85aa833faaa" containerName="registry-server"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.170989 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547830-w4xph"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.176666 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.176898 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.177223 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.186536 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547830-w4xph"]
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.275334 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx64j\" (UniqueName: \"kubernetes.io/projected/76edbf80-436a-49c3-b1d1-8209d041e0b5-kube-api-access-rx64j\") pod \"auto-csr-approver-29547830-w4xph\" (UID: \"76edbf80-436a-49c3-b1d1-8209d041e0b5\") " pod="openshift-infra/auto-csr-approver-29547830-w4xph"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.376822 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx64j\" (UniqueName: \"kubernetes.io/projected/76edbf80-436a-49c3-b1d1-8209d041e0b5-kube-api-access-rx64j\") pod \"auto-csr-approver-29547830-w4xph\" (UID: \"76edbf80-436a-49c3-b1d1-8209d041e0b5\") " pod="openshift-infra/auto-csr-approver-29547830-w4xph"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.412959 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx64j\" (UniqueName: \"kubernetes.io/projected/76edbf80-436a-49c3-b1d1-8209d041e0b5-kube-api-access-rx64j\") pod \"auto-csr-approver-29547830-w4xph\" (UID: \"76edbf80-436a-49c3-b1d1-8209d041e0b5\") " pod="openshift-infra/auto-csr-approver-29547830-w4xph"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.504458 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547830-w4xph"
Mar 07 07:50:00 crc kubenswrapper[4941]: I0307 07:50:00.961297 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547830-w4xph"]
Mar 07 07:50:01 crc kubenswrapper[4941]: I0307 07:50:01.392855 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547830-w4xph" event={"ID":"76edbf80-436a-49c3-b1d1-8209d041e0b5","Type":"ContainerStarted","Data":"903d46fcdd6d2f3d2d0461040055c10e8389107e7bef36525b45a6d4fc7e1564"}
Mar 07 07:50:02 crc kubenswrapper[4941]: I0307 07:50:02.402728 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547830-w4xph" event={"ID":"76edbf80-436a-49c3-b1d1-8209d041e0b5","Type":"ContainerStarted","Data":"a89db95ffd9a81d112466b0ffdc0ca0a016fa1e8ffa951ae5e9a5c0903a948b9"}
Mar 07 07:50:02 crc kubenswrapper[4941]: I0307 07:50:02.421601 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547830-w4xph" podStartSLOduration=1.473421788 podStartE2EDuration="2.421568591s" podCreationTimestamp="2026-03-07 07:50:00 +0000 UTC" firstStartedPulling="2026-03-07 07:50:00.967973388 +0000 UTC m=+3497.920338893" lastFinishedPulling="2026-03-07 07:50:01.916120191 +0000 UTC m=+3498.868485696" observedRunningTime="2026-03-07 07:50:02.421457718 +0000 UTC m=+3499.373823213" watchObservedRunningTime="2026-03-07 07:50:02.421568591 +0000 UTC m=+3499.373934096"
Mar 07 07:50:03 crc kubenswrapper[4941]: I0307 07:50:03.412170 4941 generic.go:334] "Generic (PLEG): container finished" podID="76edbf80-436a-49c3-b1d1-8209d041e0b5" containerID="a89db95ffd9a81d112466b0ffdc0ca0a016fa1e8ffa951ae5e9a5c0903a948b9" exitCode=0
Mar 07 07:50:03 crc kubenswrapper[4941]: I0307 07:50:03.412207 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547830-w4xph" event={"ID":"76edbf80-436a-49c3-b1d1-8209d041e0b5","Type":"ContainerDied","Data":"a89db95ffd9a81d112466b0ffdc0ca0a016fa1e8ffa951ae5e9a5c0903a948b9"}
Mar 07 07:50:04 crc kubenswrapper[4941]: I0307 07:50:04.814166 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547830-w4xph"
Mar 07 07:50:04 crc kubenswrapper[4941]: I0307 07:50:04.943728 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx64j\" (UniqueName: \"kubernetes.io/projected/76edbf80-436a-49c3-b1d1-8209d041e0b5-kube-api-access-rx64j\") pod \"76edbf80-436a-49c3-b1d1-8209d041e0b5\" (UID: \"76edbf80-436a-49c3-b1d1-8209d041e0b5\") "
Mar 07 07:50:04 crc kubenswrapper[4941]: I0307 07:50:04.953216 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76edbf80-436a-49c3-b1d1-8209d041e0b5-kube-api-access-rx64j" (OuterVolumeSpecName: "kube-api-access-rx64j") pod "76edbf80-436a-49c3-b1d1-8209d041e0b5" (UID: "76edbf80-436a-49c3-b1d1-8209d041e0b5"). InnerVolumeSpecName "kube-api-access-rx64j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:50:05 crc kubenswrapper[4941]: I0307 07:50:05.046538 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx64j\" (UniqueName: \"kubernetes.io/projected/76edbf80-436a-49c3-b1d1-8209d041e0b5-kube-api-access-rx64j\") on node \"crc\" DevicePath \"\""
Mar 07 07:50:05 crc kubenswrapper[4941]: I0307 07:50:05.459682 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547830-w4xph" event={"ID":"76edbf80-436a-49c3-b1d1-8209d041e0b5","Type":"ContainerDied","Data":"903d46fcdd6d2f3d2d0461040055c10e8389107e7bef36525b45a6d4fc7e1564"}
Mar 07 07:50:05 crc kubenswrapper[4941]: I0307 07:50:05.459757 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="903d46fcdd6d2f3d2d0461040055c10e8389107e7bef36525b45a6d4fc7e1564"
Mar 07 07:50:05 crc kubenswrapper[4941]: I0307 07:50:05.459806 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547830-w4xph"
Mar 07 07:50:05 crc kubenswrapper[4941]: I0307 07:50:05.505594 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547824-fzvrq"]
Mar 07 07:50:05 crc kubenswrapper[4941]: I0307 07:50:05.518947 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547824-fzvrq"]
Mar 07 07:50:05 crc kubenswrapper[4941]: I0307 07:50:05.965328 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30cb74da-f324-4715-ae31-1b987e8ef17d" path="/var/lib/kubelet/pods/30cb74da-f324-4715-ae31-1b987e8ef17d/volumes"
Mar 07 07:50:09 crc kubenswrapper[4941]: I0307 07:50:09.353523 4941 scope.go:117] "RemoveContainer" containerID="2fefe631df98173aa45252d754ec39191d78b7a1e96c5e8ef00de3499975b869"
Mar 07 07:51:40 crc kubenswrapper[4941]: I0307 07:51:40.313783 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:51:40 crc kubenswrapper[4941]: I0307 07:51:40.314374 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.160749 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547832-nh7dg"]
Mar 07 07:52:00 crc kubenswrapper[4941]: E0307 07:52:00.161739 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76edbf80-436a-49c3-b1d1-8209d041e0b5" containerName="oc"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.161755 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="76edbf80-436a-49c3-b1d1-8209d041e0b5" containerName="oc"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.161922 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="76edbf80-436a-49c3-b1d1-8209d041e0b5" containerName="oc"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.162500 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547832-nh7dg"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.166745 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.167392 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.168496 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.195294 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547832-nh7dg"]
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.349580 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwd8t\" (UniqueName: \"kubernetes.io/projected/7c28b30b-3be6-45e9-b702-6a6d6e1cac86-kube-api-access-fwd8t\") pod \"auto-csr-approver-29547832-nh7dg\" (UID: \"7c28b30b-3be6-45e9-b702-6a6d6e1cac86\") " pod="openshift-infra/auto-csr-approver-29547832-nh7dg"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.451057 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwd8t\" (UniqueName: \"kubernetes.io/projected/7c28b30b-3be6-45e9-b702-6a6d6e1cac86-kube-api-access-fwd8t\") pod \"auto-csr-approver-29547832-nh7dg\" (UID: \"7c28b30b-3be6-45e9-b702-6a6d6e1cac86\") " pod="openshift-infra/auto-csr-approver-29547832-nh7dg"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.474328 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwd8t\" (UniqueName: \"kubernetes.io/projected/7c28b30b-3be6-45e9-b702-6a6d6e1cac86-kube-api-access-fwd8t\") pod \"auto-csr-approver-29547832-nh7dg\" (UID: \"7c28b30b-3be6-45e9-b702-6a6d6e1cac86\") " pod="openshift-infra/auto-csr-approver-29547832-nh7dg"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.492467 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547832-nh7dg"
Mar 07 07:52:00 crc kubenswrapper[4941]: I0307 07:52:00.992491 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547832-nh7dg"]
Mar 07 07:52:01 crc kubenswrapper[4941]: I0307 07:52:01.337911 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547832-nh7dg" event={"ID":"7c28b30b-3be6-45e9-b702-6a6d6e1cac86","Type":"ContainerStarted","Data":"70c9fc1eeac9242a364e402616451973e101dec8ff3df0b88e5820b39ec35950"}
Mar 07 07:52:02 crc kubenswrapper[4941]: I0307 07:52:02.346337 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547832-nh7dg" event={"ID":"7c28b30b-3be6-45e9-b702-6a6d6e1cac86","Type":"ContainerStarted","Data":"2f3ef71d4fa836ebf730ae8deb0958a4deb9e558aa17d122d88a39a28eee2811"}
Mar 07 07:52:02 crc kubenswrapper[4941]: I0307 07:52:02.364828 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547832-nh7dg" podStartSLOduration=1.360622568 podStartE2EDuration="2.364806941s" podCreationTimestamp="2026-03-07 07:52:00 +0000 UTC" firstStartedPulling="2026-03-07 07:52:01.006199003 +0000 UTC m=+3617.958564508" lastFinishedPulling="2026-03-07 07:52:02.010383406 +0000 UTC m=+3618.962748881" observedRunningTime="2026-03-07 07:52:02.359255964 +0000 UTC m=+3619.311621429" watchObservedRunningTime="2026-03-07 07:52:02.364806941 +0000 UTC m=+3619.317172406"
Mar 07 07:52:03 crc kubenswrapper[4941]: I0307 07:52:03.362990 4941 generic.go:334] "Generic (PLEG): container finished" podID="7c28b30b-3be6-45e9-b702-6a6d6e1cac86" containerID="2f3ef71d4fa836ebf730ae8deb0958a4deb9e558aa17d122d88a39a28eee2811" exitCode=0
Mar 07 07:52:03 crc kubenswrapper[4941]: I0307 07:52:03.363218 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547832-nh7dg" event={"ID":"7c28b30b-3be6-45e9-b702-6a6d6e1cac86","Type":"ContainerDied","Data":"2f3ef71d4fa836ebf730ae8deb0958a4deb9e558aa17d122d88a39a28eee2811"}
Mar 07 07:52:04 crc kubenswrapper[4941]: I0307 07:52:04.666936 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547832-nh7dg"
Mar 07 07:52:04 crc kubenswrapper[4941]: I0307 07:52:04.736699 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwd8t\" (UniqueName: \"kubernetes.io/projected/7c28b30b-3be6-45e9-b702-6a6d6e1cac86-kube-api-access-fwd8t\") pod \"7c28b30b-3be6-45e9-b702-6a6d6e1cac86\" (UID: \"7c28b30b-3be6-45e9-b702-6a6d6e1cac86\") "
Mar 07 07:52:04 crc kubenswrapper[4941]: I0307 07:52:04.745602 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c28b30b-3be6-45e9-b702-6a6d6e1cac86-kube-api-access-fwd8t" (OuterVolumeSpecName: "kube-api-access-fwd8t") pod "7c28b30b-3be6-45e9-b702-6a6d6e1cac86" (UID: "7c28b30b-3be6-45e9-b702-6a6d6e1cac86"). InnerVolumeSpecName "kube-api-access-fwd8t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:52:04 crc kubenswrapper[4941]: I0307 07:52:04.837750 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwd8t\" (UniqueName: \"kubernetes.io/projected/7c28b30b-3be6-45e9-b702-6a6d6e1cac86-kube-api-access-fwd8t\") on node \"crc\" DevicePath \"\""
Mar 07 07:52:05 crc kubenswrapper[4941]: I0307 07:52:05.381393 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547832-nh7dg" event={"ID":"7c28b30b-3be6-45e9-b702-6a6d6e1cac86","Type":"ContainerDied","Data":"70c9fc1eeac9242a364e402616451973e101dec8ff3df0b88e5820b39ec35950"}
Mar 07 07:52:05 crc kubenswrapper[4941]: I0307 07:52:05.381449 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547832-nh7dg"
Mar 07 07:52:05 crc kubenswrapper[4941]: I0307 07:52:05.381452 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70c9fc1eeac9242a364e402616451973e101dec8ff3df0b88e5820b39ec35950"
Mar 07 07:52:05 crc kubenswrapper[4941]: I0307 07:52:05.444365 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547826-xnxbb"]
Mar 07 07:52:05 crc kubenswrapper[4941]: I0307 07:52:05.451077 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547826-xnxbb"]
Mar 07 07:52:05 crc kubenswrapper[4941]: I0307 07:52:05.965986 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85849ea0-d0b0-4911-a00f-53ac4b58d95c" path="/var/lib/kubelet/pods/85849ea0-d0b0-4911-a00f-53ac4b58d95c/volumes"
Mar 07 07:52:09 crc kubenswrapper[4941]: I0307 07:52:09.518677 4941 scope.go:117] "RemoveContainer" containerID="6cb1618a65291283f355f148744de26d668facf92d8725ff47caec1d3e5c2582"
Mar 07 07:52:10 crc kubenswrapper[4941]: I0307 07:52:10.314194 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:52:10 crc kubenswrapper[4941]: I0307 07:52:10.314476 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:52:40 crc kubenswrapper[4941]: I0307 07:52:40.314559 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:52:40 crc kubenswrapper[4941]: I0307 07:52:40.315188 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:52:40 crc kubenswrapper[4941]: I0307 07:52:40.315252 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz"
Mar 07 07:52:40 crc kubenswrapper[4941]: I0307 07:52:40.315968 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be
restarted" Mar 07 07:52:40 crc kubenswrapper[4941]: I0307 07:52:40.316049 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" gracePeriod=600 Mar 07 07:52:40 crc kubenswrapper[4941]: I0307 07:52:40.674418 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" exitCode=0 Mar 07 07:52:40 crc kubenswrapper[4941]: I0307 07:52:40.674443 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92"} Mar 07 07:52:40 crc kubenswrapper[4941]: I0307 07:52:40.674782 4941 scope.go:117] "RemoveContainer" containerID="c6dba3014bf81fa7753ec565ebbd4a7a4058e13da0db9e277e4e84371306df7a" Mar 07 07:52:40 crc kubenswrapper[4941]: E0307 07:52:40.966102 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:52:41 crc kubenswrapper[4941]: I0307 07:52:41.686567 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:52:41 crc kubenswrapper[4941]: E0307 07:52:41.686818 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:52:52 crc kubenswrapper[4941]: I0307 07:52:52.955091 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:52:52 crc kubenswrapper[4941]: E0307 07:52:52.956050 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:53:07 crc kubenswrapper[4941]: I0307 07:53:07.954359 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:53:07 crc kubenswrapper[4941]: E0307 07:53:07.955090 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:53:20 crc kubenswrapper[4941]: I0307 07:53:20.955224 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:53:20 crc kubenswrapper[4941]: E0307 07:53:20.956482 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:53:34 crc kubenswrapper[4941]: I0307 07:53:34.954680 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:53:34 crc kubenswrapper[4941]: E0307 07:53:34.955655 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:53:45 crc kubenswrapper[4941]: I0307 07:53:45.954077 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:53:45 crc kubenswrapper[4941]: E0307 07:53:45.954924 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:53:59 crc kubenswrapper[4941]: I0307 07:53:59.955144 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:53:59 crc kubenswrapper[4941]: E0307 07:53:59.956244 4941 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.161994 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547834-glmmb"] Mar 07 07:54:00 crc kubenswrapper[4941]: E0307 07:54:00.162450 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c28b30b-3be6-45e9-b702-6a6d6e1cac86" containerName="oc" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.162470 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c28b30b-3be6-45e9-b702-6a6d6e1cac86" containerName="oc" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.162755 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c28b30b-3be6-45e9-b702-6a6d6e1cac86" containerName="oc" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.163442 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547834-glmmb" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.165443 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.166008 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.166732 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.183324 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547834-glmmb"] Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.285129 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qfb6\" (UniqueName: \"kubernetes.io/projected/8610f6c7-0c60-4c3e-be92-c2c75caa6eb5-kube-api-access-2qfb6\") pod \"auto-csr-approver-29547834-glmmb\" (UID: \"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5\") " pod="openshift-infra/auto-csr-approver-29547834-glmmb" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.387131 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qfb6\" (UniqueName: \"kubernetes.io/projected/8610f6c7-0c60-4c3e-be92-c2c75caa6eb5-kube-api-access-2qfb6\") pod \"auto-csr-approver-29547834-glmmb\" (UID: \"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5\") " pod="openshift-infra/auto-csr-approver-29547834-glmmb" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.414064 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qfb6\" (UniqueName: \"kubernetes.io/projected/8610f6c7-0c60-4c3e-be92-c2c75caa6eb5-kube-api-access-2qfb6\") pod \"auto-csr-approver-29547834-glmmb\" (UID: \"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5\") " 
pod="openshift-infra/auto-csr-approver-29547834-glmmb" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.497550 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547834-glmmb" Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.988027 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547834-glmmb"] Mar 07 07:54:00 crc kubenswrapper[4941]: I0307 07:54:00.997367 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:54:01 crc kubenswrapper[4941]: I0307 07:54:01.365904 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547834-glmmb" event={"ID":"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5","Type":"ContainerStarted","Data":"7c2259e148d042feada067974f45941a91a09a5fb73a09d8e7defbff878308ca"} Mar 07 07:54:02 crc kubenswrapper[4941]: I0307 07:54:02.376885 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547834-glmmb" event={"ID":"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5","Type":"ContainerStarted","Data":"f985600e0206a94dcd2ea6d2d74f098ea465c83af3a9dbe0fc5e496322e954aa"} Mar 07 07:54:02 crc kubenswrapper[4941]: I0307 07:54:02.396686 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547834-glmmb" podStartSLOduration=1.5428192040000002 podStartE2EDuration="2.396659431s" podCreationTimestamp="2026-03-07 07:54:00 +0000 UTC" firstStartedPulling="2026-03-07 07:54:00.997093166 +0000 UTC m=+3737.949458631" lastFinishedPulling="2026-03-07 07:54:01.850933393 +0000 UTC m=+3738.803298858" observedRunningTime="2026-03-07 07:54:02.389951915 +0000 UTC m=+3739.342317370" watchObservedRunningTime="2026-03-07 07:54:02.396659431 +0000 UTC m=+3739.349024886" Mar 07 07:54:03 crc kubenswrapper[4941]: I0307 07:54:03.391022 4941 generic.go:334] "Generic (PLEG): container 
finished" podID="8610f6c7-0c60-4c3e-be92-c2c75caa6eb5" containerID="f985600e0206a94dcd2ea6d2d74f098ea465c83af3a9dbe0fc5e496322e954aa" exitCode=0 Mar 07 07:54:03 crc kubenswrapper[4941]: I0307 07:54:03.391160 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547834-glmmb" event={"ID":"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5","Type":"ContainerDied","Data":"f985600e0206a94dcd2ea6d2d74f098ea465c83af3a9dbe0fc5e496322e954aa"} Mar 07 07:54:04 crc kubenswrapper[4941]: I0307 07:54:04.684201 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547834-glmmb" Mar 07 07:54:04 crc kubenswrapper[4941]: I0307 07:54:04.755078 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qfb6\" (UniqueName: \"kubernetes.io/projected/8610f6c7-0c60-4c3e-be92-c2c75caa6eb5-kube-api-access-2qfb6\") pod \"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5\" (UID: \"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5\") " Mar 07 07:54:04 crc kubenswrapper[4941]: I0307 07:54:04.762830 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8610f6c7-0c60-4c3e-be92-c2c75caa6eb5-kube-api-access-2qfb6" (OuterVolumeSpecName: "kube-api-access-2qfb6") pod "8610f6c7-0c60-4c3e-be92-c2c75caa6eb5" (UID: "8610f6c7-0c60-4c3e-be92-c2c75caa6eb5"). InnerVolumeSpecName "kube-api-access-2qfb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:54:04 crc kubenswrapper[4941]: I0307 07:54:04.857135 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qfb6\" (UniqueName: \"kubernetes.io/projected/8610f6c7-0c60-4c3e-be92-c2c75caa6eb5-kube-api-access-2qfb6\") on node \"crc\" DevicePath \"\"" Mar 07 07:54:05 crc kubenswrapper[4941]: I0307 07:54:05.409611 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547834-glmmb" event={"ID":"8610f6c7-0c60-4c3e-be92-c2c75caa6eb5","Type":"ContainerDied","Data":"7c2259e148d042feada067974f45941a91a09a5fb73a09d8e7defbff878308ca"} Mar 07 07:54:05 crc kubenswrapper[4941]: I0307 07:54:05.409650 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c2259e148d042feada067974f45941a91a09a5fb73a09d8e7defbff878308ca" Mar 07 07:54:05 crc kubenswrapper[4941]: I0307 07:54:05.409732 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547834-glmmb" Mar 07 07:54:05 crc kubenswrapper[4941]: I0307 07:54:05.468300 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547828-899jh"] Mar 07 07:54:05 crc kubenswrapper[4941]: I0307 07:54:05.474633 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547828-899jh"] Mar 07 07:54:05 crc kubenswrapper[4941]: I0307 07:54:05.961592 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb" path="/var/lib/kubelet/pods/a6bbbb83-aa01-4c10-8ef6-c0dc29f0f4cb/volumes" Mar 07 07:54:09 crc kubenswrapper[4941]: I0307 07:54:09.584577 4941 scope.go:117] "RemoveContainer" containerID="c912267d31265560e3a721d06f09b02eac33acc64eda213e9bac2a6ceceda0ef" Mar 07 07:54:10 crc kubenswrapper[4941]: I0307 07:54:10.955110 4941 scope.go:117] "RemoveContainer" 
containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:54:10 crc kubenswrapper[4941]: E0307 07:54:10.955518 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:54:22 crc kubenswrapper[4941]: I0307 07:54:22.955705 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:54:22 crc kubenswrapper[4941]: E0307 07:54:22.956882 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:54:33 crc kubenswrapper[4941]: I0307 07:54:33.959682 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:54:33 crc kubenswrapper[4941]: E0307 07:54:33.960343 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:54:46 crc kubenswrapper[4941]: I0307 07:54:46.954537 4941 scope.go:117] 
"RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:54:46 crc kubenswrapper[4941]: E0307 07:54:46.955342 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:54:58 crc kubenswrapper[4941]: I0307 07:54:58.955659 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:54:58 crc kubenswrapper[4941]: E0307 07:54:58.956553 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:55:12 crc kubenswrapper[4941]: I0307 07:55:12.955233 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:55:12 crc kubenswrapper[4941]: E0307 07:55:12.956376 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:55:24 crc kubenswrapper[4941]: I0307 07:55:24.955073 
4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:55:24 crc kubenswrapper[4941]: E0307 07:55:24.955954 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:55:36 crc kubenswrapper[4941]: I0307 07:55:36.954521 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:55:36 crc kubenswrapper[4941]: E0307 07:55:36.956484 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:55:51 crc kubenswrapper[4941]: I0307 07:55:51.955280 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:55:51 crc kubenswrapper[4941]: E0307 07:55:51.956112 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 
07:56:00.183398 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547836-v2xgb"] Mar 07 07:56:00 crc kubenswrapper[4941]: E0307 07:56:00.184569 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8610f6c7-0c60-4c3e-be92-c2c75caa6eb5" containerName="oc" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.184594 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="8610f6c7-0c60-4c3e-be92-c2c75caa6eb5" containerName="oc" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.184880 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="8610f6c7-0c60-4c3e-be92-c2c75caa6eb5" containerName="oc" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.185747 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547836-v2xgb" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.188947 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.189579 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.190491 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t57j2\" (UniqueName: \"kubernetes.io/projected/11eb1bfb-3396-4b0d-a07a-55ff9946e2ed-kube-api-access-t57j2\") pod \"auto-csr-approver-29547836-v2xgb\" (UID: \"11eb1bfb-3396-4b0d-a07a-55ff9946e2ed\") " pod="openshift-infra/auto-csr-approver-29547836-v2xgb" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.192990 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.199680 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29547836-v2xgb"] Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.291932 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t57j2\" (UniqueName: \"kubernetes.io/projected/11eb1bfb-3396-4b0d-a07a-55ff9946e2ed-kube-api-access-t57j2\") pod \"auto-csr-approver-29547836-v2xgb\" (UID: \"11eb1bfb-3396-4b0d-a07a-55ff9946e2ed\") " pod="openshift-infra/auto-csr-approver-29547836-v2xgb" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.316871 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t57j2\" (UniqueName: \"kubernetes.io/projected/11eb1bfb-3396-4b0d-a07a-55ff9946e2ed-kube-api-access-t57j2\") pod \"auto-csr-approver-29547836-v2xgb\" (UID: \"11eb1bfb-3396-4b0d-a07a-55ff9946e2ed\") " pod="openshift-infra/auto-csr-approver-29547836-v2xgb" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.513230 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547836-v2xgb" Mar 07 07:56:00 crc kubenswrapper[4941]: I0307 07:56:00.760990 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547836-v2xgb"] Mar 07 07:56:00 crc kubenswrapper[4941]: W0307 07:56:00.770369 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11eb1bfb_3396_4b0d_a07a_55ff9946e2ed.slice/crio-c481876318225882475091efcde7ad89e27b60e24925519348bb2e6ffe2f25e5 WatchSource:0}: Error finding container c481876318225882475091efcde7ad89e27b60e24925519348bb2e6ffe2f25e5: Status 404 returned error can't find the container with id c481876318225882475091efcde7ad89e27b60e24925519348bb2e6ffe2f25e5 Mar 07 07:56:01 crc kubenswrapper[4941]: I0307 07:56:01.424533 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547836-v2xgb" 
event={"ID":"11eb1bfb-3396-4b0d-a07a-55ff9946e2ed","Type":"ContainerStarted","Data":"c481876318225882475091efcde7ad89e27b60e24925519348bb2e6ffe2f25e5"} Mar 07 07:56:02 crc kubenswrapper[4941]: I0307 07:56:02.433036 4941 generic.go:334] "Generic (PLEG): container finished" podID="11eb1bfb-3396-4b0d-a07a-55ff9946e2ed" containerID="eb3e73d76f1ebd9c5862cca54e476593592fbf0cba2a9406c254257e887179c3" exitCode=0 Mar 07 07:56:02 crc kubenswrapper[4941]: I0307 07:56:02.433222 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547836-v2xgb" event={"ID":"11eb1bfb-3396-4b0d-a07a-55ff9946e2ed","Type":"ContainerDied","Data":"eb3e73d76f1ebd9c5862cca54e476593592fbf0cba2a9406c254257e887179c3"} Mar 07 07:56:02 crc kubenswrapper[4941]: I0307 07:56:02.954455 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:56:02 crc kubenswrapper[4941]: E0307 07:56:02.954793 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:56:03 crc kubenswrapper[4941]: I0307 07:56:03.706102 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547836-v2xgb" Mar 07 07:56:03 crc kubenswrapper[4941]: I0307 07:56:03.740701 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t57j2\" (UniqueName: \"kubernetes.io/projected/11eb1bfb-3396-4b0d-a07a-55ff9946e2ed-kube-api-access-t57j2\") pod \"11eb1bfb-3396-4b0d-a07a-55ff9946e2ed\" (UID: \"11eb1bfb-3396-4b0d-a07a-55ff9946e2ed\") " Mar 07 07:56:03 crc kubenswrapper[4941]: I0307 07:56:03.748896 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11eb1bfb-3396-4b0d-a07a-55ff9946e2ed-kube-api-access-t57j2" (OuterVolumeSpecName: "kube-api-access-t57j2") pod "11eb1bfb-3396-4b0d-a07a-55ff9946e2ed" (UID: "11eb1bfb-3396-4b0d-a07a-55ff9946e2ed"). InnerVolumeSpecName "kube-api-access-t57j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:56:03 crc kubenswrapper[4941]: I0307 07:56:03.842053 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t57j2\" (UniqueName: \"kubernetes.io/projected/11eb1bfb-3396-4b0d-a07a-55ff9946e2ed-kube-api-access-t57j2\") on node \"crc\" DevicePath \"\"" Mar 07 07:56:04 crc kubenswrapper[4941]: I0307 07:56:04.457716 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547836-v2xgb" event={"ID":"11eb1bfb-3396-4b0d-a07a-55ff9946e2ed","Type":"ContainerDied","Data":"c481876318225882475091efcde7ad89e27b60e24925519348bb2e6ffe2f25e5"} Mar 07 07:56:04 crc kubenswrapper[4941]: I0307 07:56:04.457784 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c481876318225882475091efcde7ad89e27b60e24925519348bb2e6ffe2f25e5" Mar 07 07:56:04 crc kubenswrapper[4941]: I0307 07:56:04.457816 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547836-v2xgb" Mar 07 07:56:04 crc kubenswrapper[4941]: I0307 07:56:04.802852 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547830-w4xph"] Mar 07 07:56:04 crc kubenswrapper[4941]: I0307 07:56:04.809722 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547830-w4xph"] Mar 07 07:56:05 crc kubenswrapper[4941]: I0307 07:56:05.968822 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76edbf80-436a-49c3-b1d1-8209d041e0b5" path="/var/lib/kubelet/pods/76edbf80-436a-49c3-b1d1-8209d041e0b5/volumes" Mar 07 07:56:09 crc kubenswrapper[4941]: I0307 07:56:09.698669 4941 scope.go:117] "RemoveContainer" containerID="a89db95ffd9a81d112466b0ffdc0ca0a016fa1e8ffa951ae5e9a5c0903a948b9" Mar 07 07:56:15 crc kubenswrapper[4941]: I0307 07:56:15.954384 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:56:15 crc kubenswrapper[4941]: E0307 07:56:15.955525 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:56:26 crc kubenswrapper[4941]: I0307 07:56:26.954581 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:56:26 crc kubenswrapper[4941]: E0307 07:56:26.956446 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:56:38 crc kubenswrapper[4941]: I0307 07:56:38.955241 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:56:38 crc kubenswrapper[4941]: E0307 07:56:38.956260 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:56:52 crc kubenswrapper[4941]: I0307 07:56:52.954142 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:56:52 crc kubenswrapper[4941]: E0307 07:56:52.954775 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:57:04 crc kubenswrapper[4941]: I0307 07:57:04.954452 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:57:04 crc kubenswrapper[4941]: E0307 07:57:04.955434 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:57:17 crc kubenswrapper[4941]: I0307 07:57:17.954320 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:57:17 crc kubenswrapper[4941]: E0307 07:57:17.955083 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:57:31 crc kubenswrapper[4941]: I0307 07:57:31.954329 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:57:31 crc kubenswrapper[4941]: E0307 07:57:31.955392 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 07:57:42 crc kubenswrapper[4941]: I0307 07:57:42.957672 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 07:57:43 crc kubenswrapper[4941]: I0307 07:57:43.401566 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" 
event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"a1d0b5eb92e31d5229449e1e28ad7dec12cd8ead59c2438d98b39bdcd41bf81a"} Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.157310 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547838-dz7hs"] Mar 07 07:58:00 crc kubenswrapper[4941]: E0307 07:58:00.158172 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11eb1bfb-3396-4b0d-a07a-55ff9946e2ed" containerName="oc" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.158188 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="11eb1bfb-3396-4b0d-a07a-55ff9946e2ed" containerName="oc" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.158387 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="11eb1bfb-3396-4b0d-a07a-55ff9946e2ed" containerName="oc" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.159507 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547838-dz7hs" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.161465 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.162699 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.163143 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.173082 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547838-dz7hs"] Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.177928 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlq5x\" (UniqueName: 
\"kubernetes.io/projected/fef60d16-b229-4718-8438-d39698de0607-kube-api-access-dlq5x\") pod \"auto-csr-approver-29547838-dz7hs\" (UID: \"fef60d16-b229-4718-8438-d39698de0607\") " pod="openshift-infra/auto-csr-approver-29547838-dz7hs" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.278742 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlq5x\" (UniqueName: \"kubernetes.io/projected/fef60d16-b229-4718-8438-d39698de0607-kube-api-access-dlq5x\") pod \"auto-csr-approver-29547838-dz7hs\" (UID: \"fef60d16-b229-4718-8438-d39698de0607\") " pod="openshift-infra/auto-csr-approver-29547838-dz7hs" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.309021 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlq5x\" (UniqueName: \"kubernetes.io/projected/fef60d16-b229-4718-8438-d39698de0607-kube-api-access-dlq5x\") pod \"auto-csr-approver-29547838-dz7hs\" (UID: \"fef60d16-b229-4718-8438-d39698de0607\") " pod="openshift-infra/auto-csr-approver-29547838-dz7hs" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.483191 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547838-dz7hs" Mar 07 07:58:00 crc kubenswrapper[4941]: I0307 07:58:00.973807 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547838-dz7hs"] Mar 07 07:58:00 crc kubenswrapper[4941]: W0307 07:58:00.984651 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfef60d16_b229_4718_8438_d39698de0607.slice/crio-b766fa762e8942cee38033ed8792cd06b9588876d5c940a25a7bbabc4c9aceb3 WatchSource:0}: Error finding container b766fa762e8942cee38033ed8792cd06b9588876d5c940a25a7bbabc4c9aceb3: Status 404 returned error can't find the container with id b766fa762e8942cee38033ed8792cd06b9588876d5c940a25a7bbabc4c9aceb3 Mar 07 07:58:01 crc kubenswrapper[4941]: I0307 07:58:01.552937 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547838-dz7hs" event={"ID":"fef60d16-b229-4718-8438-d39698de0607","Type":"ContainerStarted","Data":"b766fa762e8942cee38033ed8792cd06b9588876d5c940a25a7bbabc4c9aceb3"} Mar 07 07:58:02 crc kubenswrapper[4941]: I0307 07:58:02.561009 4941 generic.go:334] "Generic (PLEG): container finished" podID="fef60d16-b229-4718-8438-d39698de0607" containerID="d4173fa55db3f5762a8f5d08e6debfa06e56eaec69b92f34c0f286a8b833008e" exitCode=0 Mar 07 07:58:02 crc kubenswrapper[4941]: I0307 07:58:02.561182 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547838-dz7hs" event={"ID":"fef60d16-b229-4718-8438-d39698de0607","Type":"ContainerDied","Data":"d4173fa55db3f5762a8f5d08e6debfa06e56eaec69b92f34c0f286a8b833008e"} Mar 07 07:58:03 crc kubenswrapper[4941]: I0307 07:58:03.919021 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547838-dz7hs" Mar 07 07:58:03 crc kubenswrapper[4941]: I0307 07:58:03.944004 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlq5x\" (UniqueName: \"kubernetes.io/projected/fef60d16-b229-4718-8438-d39698de0607-kube-api-access-dlq5x\") pod \"fef60d16-b229-4718-8438-d39698de0607\" (UID: \"fef60d16-b229-4718-8438-d39698de0607\") " Mar 07 07:58:03 crc kubenswrapper[4941]: I0307 07:58:03.968088 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef60d16-b229-4718-8438-d39698de0607-kube-api-access-dlq5x" (OuterVolumeSpecName: "kube-api-access-dlq5x") pod "fef60d16-b229-4718-8438-d39698de0607" (UID: "fef60d16-b229-4718-8438-d39698de0607"). InnerVolumeSpecName "kube-api-access-dlq5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:58:04 crc kubenswrapper[4941]: I0307 07:58:04.046397 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlq5x\" (UniqueName: \"kubernetes.io/projected/fef60d16-b229-4718-8438-d39698de0607-kube-api-access-dlq5x\") on node \"crc\" DevicePath \"\"" Mar 07 07:58:04 crc kubenswrapper[4941]: I0307 07:58:04.580795 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547838-dz7hs" event={"ID":"fef60d16-b229-4718-8438-d39698de0607","Type":"ContainerDied","Data":"b766fa762e8942cee38033ed8792cd06b9588876d5c940a25a7bbabc4c9aceb3"} Mar 07 07:58:04 crc kubenswrapper[4941]: I0307 07:58:04.581129 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b766fa762e8942cee38033ed8792cd06b9588876d5c940a25a7bbabc4c9aceb3" Mar 07 07:58:04 crc kubenswrapper[4941]: I0307 07:58:04.580857 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547838-dz7hs" Mar 07 07:58:05 crc kubenswrapper[4941]: I0307 07:58:05.018344 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547832-nh7dg"] Mar 07 07:58:05 crc kubenswrapper[4941]: I0307 07:58:05.026677 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547832-nh7dg"] Mar 07 07:58:05 crc kubenswrapper[4941]: I0307 07:58:05.965375 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c28b30b-3be6-45e9-b702-6a6d6e1cac86" path="/var/lib/kubelet/pods/7c28b30b-3be6-45e9-b702-6a6d6e1cac86/volumes" Mar 07 07:58:09 crc kubenswrapper[4941]: I0307 07:58:09.799928 4941 scope.go:117] "RemoveContainer" containerID="2f3ef71d4fa836ebf730ae8deb0958a4deb9e558aa17d122d88a39a28eee2811" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.610793 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxsf"] Mar 07 07:59:24 crc kubenswrapper[4941]: E0307 07:59:24.613206 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef60d16-b229-4718-8438-d39698de0607" containerName="oc" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.613391 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef60d16-b229-4718-8438-d39698de0607" containerName="oc" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.613788 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="fef60d16-b229-4718-8438-d39698de0607" containerName="oc" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.617380 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.627060 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxsf"] Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.698821 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-utilities\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.699043 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-catalog-content\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.699238 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tngt\" (UniqueName: \"kubernetes.io/projected/3ee8e8d2-e020-4668-9cf7-69f726f86a65-kube-api-access-2tngt\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.801111 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-utilities\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.801208 4941 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-catalog-content\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.801264 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tngt\" (UniqueName: \"kubernetes.io/projected/3ee8e8d2-e020-4668-9cf7-69f726f86a65-kube-api-access-2tngt\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.801783 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-utilities\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.802343 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-catalog-content\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.836611 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tngt\" (UniqueName: \"kubernetes.io/projected/3ee8e8d2-e020-4668-9cf7-69f726f86a65-kube-api-access-2tngt\") pod \"redhat-marketplace-vtxsf\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:24 crc kubenswrapper[4941]: I0307 07:59:24.951248 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:25 crc kubenswrapper[4941]: I0307 07:59:25.511571 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxsf"] Mar 07 07:59:25 crc kubenswrapper[4941]: W0307 07:59:25.784838 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ee8e8d2_e020_4668_9cf7_69f726f86a65.slice/crio-2ac986330519a18dbab5f4997daf2201002b025fc4eff894f357ad7f1d8310a2 WatchSource:0}: Error finding container 2ac986330519a18dbab5f4997daf2201002b025fc4eff894f357ad7f1d8310a2: Status 404 returned error can't find the container with id 2ac986330519a18dbab5f4997daf2201002b025fc4eff894f357ad7f1d8310a2 Mar 07 07:59:26 crc kubenswrapper[4941]: I0307 07:59:26.259996 4941 generic.go:334] "Generic (PLEG): container finished" podID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerID="b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d" exitCode=0 Mar 07 07:59:26 crc kubenswrapper[4941]: I0307 07:59:26.260075 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxsf" event={"ID":"3ee8e8d2-e020-4668-9cf7-69f726f86a65","Type":"ContainerDied","Data":"b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d"} Mar 07 07:59:26 crc kubenswrapper[4941]: I0307 07:59:26.260455 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxsf" event={"ID":"3ee8e8d2-e020-4668-9cf7-69f726f86a65","Type":"ContainerStarted","Data":"2ac986330519a18dbab5f4997daf2201002b025fc4eff894f357ad7f1d8310a2"} Mar 07 07:59:26 crc kubenswrapper[4941]: I0307 07:59:26.261646 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:59:27 crc kubenswrapper[4941]: I0307 07:59:27.279312 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vtxsf" event={"ID":"3ee8e8d2-e020-4668-9cf7-69f726f86a65","Type":"ContainerStarted","Data":"bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096"} Mar 07 07:59:28 crc kubenswrapper[4941]: I0307 07:59:28.291919 4941 generic.go:334] "Generic (PLEG): container finished" podID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerID="bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096" exitCode=0 Mar 07 07:59:28 crc kubenswrapper[4941]: I0307 07:59:28.291986 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxsf" event={"ID":"3ee8e8d2-e020-4668-9cf7-69f726f86a65","Type":"ContainerDied","Data":"bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096"} Mar 07 07:59:29 crc kubenswrapper[4941]: I0307 07:59:29.305273 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxsf" event={"ID":"3ee8e8d2-e020-4668-9cf7-69f726f86a65","Type":"ContainerStarted","Data":"2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88"} Mar 07 07:59:29 crc kubenswrapper[4941]: I0307 07:59:29.345315 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vtxsf" podStartSLOduration=2.928917329 podStartE2EDuration="5.345289244s" podCreationTimestamp="2026-03-07 07:59:24 +0000 UTC" firstStartedPulling="2026-03-07 07:59:26.261458968 +0000 UTC m=+4063.213824433" lastFinishedPulling="2026-03-07 07:59:28.677830843 +0000 UTC m=+4065.630196348" observedRunningTime="2026-03-07 07:59:29.335465749 +0000 UTC m=+4066.287831254" watchObservedRunningTime="2026-03-07 07:59:29.345289244 +0000 UTC m=+4066.297654739" Mar 07 07:59:34 crc kubenswrapper[4941]: I0307 07:59:34.952484 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:34 crc kubenswrapper[4941]: I0307 07:59:34.953191 4941 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:35 crc kubenswrapper[4941]: I0307 07:59:35.016435 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:35 crc kubenswrapper[4941]: I0307 07:59:35.845251 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:35 crc kubenswrapper[4941]: I0307 07:59:35.914020 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxsf"] Mar 07 07:59:37 crc kubenswrapper[4941]: I0307 07:59:37.597451 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vtxsf" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerName="registry-server" containerID="cri-o://2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88" gracePeriod=2 Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.251206 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.412127 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-utilities\") pod \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.412220 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-catalog-content\") pod \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.412265 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tngt\" (UniqueName: \"kubernetes.io/projected/3ee8e8d2-e020-4668-9cf7-69f726f86a65-kube-api-access-2tngt\") pod \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\" (UID: \"3ee8e8d2-e020-4668-9cf7-69f726f86a65\") " Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.413395 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-utilities" (OuterVolumeSpecName: "utilities") pod "3ee8e8d2-e020-4668-9cf7-69f726f86a65" (UID: "3ee8e8d2-e020-4668-9cf7-69f726f86a65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.421648 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee8e8d2-e020-4668-9cf7-69f726f86a65-kube-api-access-2tngt" (OuterVolumeSpecName: "kube-api-access-2tngt") pod "3ee8e8d2-e020-4668-9cf7-69f726f86a65" (UID: "3ee8e8d2-e020-4668-9cf7-69f726f86a65"). InnerVolumeSpecName "kube-api-access-2tngt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.449089 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ee8e8d2-e020-4668-9cf7-69f726f86a65" (UID: "3ee8e8d2-e020-4668-9cf7-69f726f86a65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.513775 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.513816 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tngt\" (UniqueName: \"kubernetes.io/projected/3ee8e8d2-e020-4668-9cf7-69f726f86a65-kube-api-access-2tngt\") on node \"crc\" DevicePath \"\"" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.513829 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee8e8d2-e020-4668-9cf7-69f726f86a65-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.610878 4941 generic.go:334] "Generic (PLEG): container finished" podID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerID="2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88" exitCode=0 Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.610934 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtxsf" event={"ID":"3ee8e8d2-e020-4668-9cf7-69f726f86a65","Type":"ContainerDied","Data":"2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88"} Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.610975 4941 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-vtxsf" event={"ID":"3ee8e8d2-e020-4668-9cf7-69f726f86a65","Type":"ContainerDied","Data":"2ac986330519a18dbab5f4997daf2201002b025fc4eff894f357ad7f1d8310a2"} Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.610974 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtxsf" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.611022 4941 scope.go:117] "RemoveContainer" containerID="2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.643502 4941 scope.go:117] "RemoveContainer" containerID="bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.687005 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxsf"] Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.699100 4941 scope.go:117] "RemoveContainer" containerID="b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.703288 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtxsf"] Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.715493 4941 scope.go:117] "RemoveContainer" containerID="2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88" Mar 07 07:59:38 crc kubenswrapper[4941]: E0307 07:59:38.716012 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88\": container with ID starting with 2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88 not found: ID does not exist" containerID="2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.716125 4941 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88"} err="failed to get container status \"2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88\": rpc error: code = NotFound desc = could not find container \"2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88\": container with ID starting with 2e644a66f8e6246832d7d8536c87d78f05e7bd01ea7baf80738b5da911aaaf88 not found: ID does not exist" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.716232 4941 scope.go:117] "RemoveContainer" containerID="bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096" Mar 07 07:59:38 crc kubenswrapper[4941]: E0307 07:59:38.716636 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096\": container with ID starting with bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096 not found: ID does not exist" containerID="bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.716681 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096"} err="failed to get container status \"bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096\": rpc error: code = NotFound desc = could not find container \"bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096\": container with ID starting with bcfcb4cb5e95ae2f1a4c96d84624b5d4d18e0d07e18dce954a9a7945aa79b096 not found: ID does not exist" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.716695 4941 scope.go:117] "RemoveContainer" containerID="b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d" Mar 07 07:59:38 crc kubenswrapper[4941]: E0307 
07:59:38.717015 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d\": container with ID starting with b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d not found: ID does not exist" containerID="b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d" Mar 07 07:59:38 crc kubenswrapper[4941]: I0307 07:59:38.717062 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d"} err="failed to get container status \"b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d\": rpc error: code = NotFound desc = could not find container \"b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d\": container with ID starting with b7cb0e2879d3401490f9264430ee6630617e96e5cb68680040910ccf0d78563d not found: ID does not exist" Mar 07 07:59:39 crc kubenswrapper[4941]: I0307 07:59:39.969752 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" path="/var/lib/kubelet/pods/3ee8e8d2-e020-4668-9cf7-69f726f86a65/volumes" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.642522 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xfqtp"] Mar 07 07:59:44 crc kubenswrapper[4941]: E0307 07:59:44.643182 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerName="extract-content" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.643200 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerName="extract-content" Mar 07 07:59:44 crc kubenswrapper[4941]: E0307 07:59:44.643216 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" 
containerName="extract-utilities" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.643224 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerName="extract-utilities" Mar 07 07:59:44 crc kubenswrapper[4941]: E0307 07:59:44.643245 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerName="registry-server" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.643253 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerName="registry-server" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.643444 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee8e8d2-e020-4668-9cf7-69f726f86a65" containerName="registry-server" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.644698 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.662671 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xfqtp"] Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.812951 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-utilities\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.813039 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-catalog-content\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 
07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.813070 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tzzn\" (UniqueName: \"kubernetes.io/projected/f6d1f0bc-ce0a-48aa-b729-0d287217f619-kube-api-access-7tzzn\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.917130 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-catalog-content\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.917198 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tzzn\" (UniqueName: \"kubernetes.io/projected/f6d1f0bc-ce0a-48aa-b729-0d287217f619-kube-api-access-7tzzn\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.917288 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-utilities\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.917819 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-utilities\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc 
kubenswrapper[4941]: I0307 07:59:44.918087 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-catalog-content\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.954849 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tzzn\" (UniqueName: \"kubernetes.io/projected/f6d1f0bc-ce0a-48aa-b729-0d287217f619-kube-api-access-7tzzn\") pod \"redhat-operators-xfqtp\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:44 crc kubenswrapper[4941]: I0307 07:59:44.970764 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:45 crc kubenswrapper[4941]: I0307 07:59:45.224501 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xfqtp"] Mar 07 07:59:45 crc kubenswrapper[4941]: I0307 07:59:45.669791 4941 generic.go:334] "Generic (PLEG): container finished" podID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerID="150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f" exitCode=0 Mar 07 07:59:45 crc kubenswrapper[4941]: I0307 07:59:45.669859 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfqtp" event={"ID":"f6d1f0bc-ce0a-48aa-b729-0d287217f619","Type":"ContainerDied","Data":"150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f"} Mar 07 07:59:45 crc kubenswrapper[4941]: I0307 07:59:45.669900 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfqtp" 
event={"ID":"f6d1f0bc-ce0a-48aa-b729-0d287217f619","Type":"ContainerStarted","Data":"da9d1cdaac40fe8d30b9cbdf4016a8625c8531adb7902f6176117e0b981c3d44"} Mar 07 07:59:46 crc kubenswrapper[4941]: I0307 07:59:46.692202 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfqtp" event={"ID":"f6d1f0bc-ce0a-48aa-b729-0d287217f619","Type":"ContainerStarted","Data":"ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108"} Mar 07 07:59:47 crc kubenswrapper[4941]: I0307 07:59:47.702932 4941 generic.go:334] "Generic (PLEG): container finished" podID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerID="ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108" exitCode=0 Mar 07 07:59:47 crc kubenswrapper[4941]: I0307 07:59:47.703003 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfqtp" event={"ID":"f6d1f0bc-ce0a-48aa-b729-0d287217f619","Type":"ContainerDied","Data":"ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108"} Mar 07 07:59:48 crc kubenswrapper[4941]: I0307 07:59:48.713764 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfqtp" event={"ID":"f6d1f0bc-ce0a-48aa-b729-0d287217f619","Type":"ContainerStarted","Data":"68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e"} Mar 07 07:59:48 crc kubenswrapper[4941]: I0307 07:59:48.736311 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xfqtp" podStartSLOduration=2.26519306 podStartE2EDuration="4.736288325s" podCreationTimestamp="2026-03-07 07:59:44 +0000 UTC" firstStartedPulling="2026-03-07 07:59:45.671822831 +0000 UTC m=+4082.624188306" lastFinishedPulling="2026-03-07 07:59:48.142918066 +0000 UTC m=+4085.095283571" observedRunningTime="2026-03-07 07:59:48.730835068 +0000 UTC m=+4085.683200543" watchObservedRunningTime="2026-03-07 07:59:48.736288325 +0000 UTC m=+4085.688653790" 
Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.005812 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nvcsn"] Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.007859 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.019381 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvcsn"] Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.115203 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-utilities\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.115341 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-catalog-content\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.115363 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmqq\" (UniqueName: \"kubernetes.io/projected/d0a8df16-342b-472f-b696-ea5c595954d2-kube-api-access-bpmqq\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.216310 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-catalog-content\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.216371 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmqq\" (UniqueName: \"kubernetes.io/projected/d0a8df16-342b-472f-b696-ea5c595954d2-kube-api-access-bpmqq\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.216428 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-utilities\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.216990 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-catalog-content\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.217121 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-utilities\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.244361 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpmqq\" (UniqueName: 
\"kubernetes.io/projected/d0a8df16-342b-472f-b696-ea5c595954d2-kube-api-access-bpmqq\") pod \"community-operators-nvcsn\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.334235 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvcsn" Mar 07 07:59:51 crc kubenswrapper[4941]: I0307 07:59:51.874352 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvcsn"] Mar 07 07:59:51 crc kubenswrapper[4941]: W0307 07:59:51.876374 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a8df16_342b_472f_b696_ea5c595954d2.slice/crio-31101a16441e255a505545d41904a59449e501952fa8736d1309e8bc050f3e02 WatchSource:0}: Error finding container 31101a16441e255a505545d41904a59449e501952fa8736d1309e8bc050f3e02: Status 404 returned error can't find the container with id 31101a16441e255a505545d41904a59449e501952fa8736d1309e8bc050f3e02 Mar 07 07:59:52 crc kubenswrapper[4941]: I0307 07:59:52.747700 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcsn" event={"ID":"d0a8df16-342b-472f-b696-ea5c595954d2","Type":"ContainerStarted","Data":"8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57"} Mar 07 07:59:52 crc kubenswrapper[4941]: I0307 07:59:52.747995 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcsn" event={"ID":"d0a8df16-342b-472f-b696-ea5c595954d2","Type":"ContainerStarted","Data":"31101a16441e255a505545d41904a59449e501952fa8736d1309e8bc050f3e02"} Mar 07 07:59:53 crc kubenswrapper[4941]: I0307 07:59:53.759660 4941 generic.go:334] "Generic (PLEG): container finished" podID="d0a8df16-342b-472f-b696-ea5c595954d2" 
containerID="8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57" exitCode=0 Mar 07 07:59:53 crc kubenswrapper[4941]: I0307 07:59:53.759765 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcsn" event={"ID":"d0a8df16-342b-472f-b696-ea5c595954d2","Type":"ContainerDied","Data":"8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57"} Mar 07 07:59:54 crc kubenswrapper[4941]: I0307 07:59:54.772532 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcsn" event={"ID":"d0a8df16-342b-472f-b696-ea5c595954d2","Type":"ContainerStarted","Data":"a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf"} Mar 07 07:59:54 crc kubenswrapper[4941]: I0307 07:59:54.971904 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:54 crc kubenswrapper[4941]: I0307 07:59:54.971999 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 07:59:55 crc kubenswrapper[4941]: I0307 07:59:55.781594 4941 generic.go:334] "Generic (PLEG): container finished" podID="d0a8df16-342b-472f-b696-ea5c595954d2" containerID="a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf" exitCode=0 Mar 07 07:59:55 crc kubenswrapper[4941]: I0307 07:59:55.781634 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcsn" event={"ID":"d0a8df16-342b-472f-b696-ea5c595954d2","Type":"ContainerDied","Data":"a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf"} Mar 07 07:59:56 crc kubenswrapper[4941]: I0307 07:59:56.047313 4941 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xfqtp" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="registry-server" probeResult="failure" output=< Mar 07 07:59:56 crc 
kubenswrapper[4941]: timeout: failed to connect service ":50051" within 1s Mar 07 07:59:56 crc kubenswrapper[4941]: > Mar 07 07:59:56 crc kubenswrapper[4941]: I0307 07:59:56.792488 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcsn" event={"ID":"d0a8df16-342b-472f-b696-ea5c595954d2","Type":"ContainerStarted","Data":"9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9"} Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.146552 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nvcsn" podStartSLOduration=7.541297263 podStartE2EDuration="10.146527512s" podCreationTimestamp="2026-03-07 07:59:50 +0000 UTC" firstStartedPulling="2026-03-07 07:59:53.763538892 +0000 UTC m=+4090.715904397" lastFinishedPulling="2026-03-07 07:59:56.368769171 +0000 UTC m=+4093.321134646" observedRunningTime="2026-03-07 07:59:56.822686112 +0000 UTC m=+4093.775051587" watchObservedRunningTime="2026-03-07 08:00:00.146527512 +0000 UTC m=+4097.098892997" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.151948 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547840-hbmbh"] Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.154058 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547840-hbmbh" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.160100 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.160330 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.160970 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.175355 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc"] Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.176621 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.178802 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.181251 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547840-hbmbh"] Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.185078 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.188071 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc"] Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.353273 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxwn5\" (UniqueName: 
\"kubernetes.io/projected/60d00e10-29f3-4800-9974-ab73ec201798-kube-api-access-pxwn5\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.353845 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60d00e10-29f3-4800-9974-ab73ec201798-config-volume\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.353884 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60d00e10-29f3-4800-9974-ab73ec201798-secret-volume\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.353927 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhxrl\" (UniqueName: \"kubernetes.io/projected/da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3-kube-api-access-vhxrl\") pod \"auto-csr-approver-29547840-hbmbh\" (UID: \"da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3\") " pod="openshift-infra/auto-csr-approver-29547840-hbmbh" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.455261 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60d00e10-29f3-4800-9974-ab73ec201798-config-volume\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 
crc kubenswrapper[4941]: I0307 08:00:00.455308 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60d00e10-29f3-4800-9974-ab73ec201798-secret-volume\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.455330 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxrl\" (UniqueName: \"kubernetes.io/projected/da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3-kube-api-access-vhxrl\") pod \"auto-csr-approver-29547840-hbmbh\" (UID: \"da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3\") " pod="openshift-infra/auto-csr-approver-29547840-hbmbh" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.455354 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxwn5\" (UniqueName: \"kubernetes.io/projected/60d00e10-29f3-4800-9974-ab73ec201798-kube-api-access-pxwn5\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.456616 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60d00e10-29f3-4800-9974-ab73ec201798-config-volume\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.468183 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60d00e10-29f3-4800-9974-ab73ec201798-secret-volume\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.477046 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxwn5\" (UniqueName: \"kubernetes.io/projected/60d00e10-29f3-4800-9974-ab73ec201798-kube-api-access-pxwn5\") pod \"collect-profiles-29547840-r4fmc\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.484149 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxrl\" (UniqueName: \"kubernetes.io/projected/da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3-kube-api-access-vhxrl\") pod \"auto-csr-approver-29547840-hbmbh\" (UID: \"da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3\") " pod="openshift-infra/auto-csr-approver-29547840-hbmbh" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.496792 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.783535 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547840-hbmbh" Mar 07 08:00:00 crc kubenswrapper[4941]: I0307 08:00:00.920882 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc"] Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.088237 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547840-hbmbh"] Mar 07 08:00:01 crc kubenswrapper[4941]: W0307 08:00:01.097746 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda4abbf1_9c5b_479c_bfc6_5e8cf7da16a3.slice/crio-e8e7893c5426d3c7c1d807e1145d3ff222a997165b033a0c89e44db9f78b7c2c WatchSource:0}: Error finding container e8e7893c5426d3c7c1d807e1145d3ff222a997165b033a0c89e44db9f78b7c2c: Status 404 returned error can't find the container with id e8e7893c5426d3c7c1d807e1145d3ff222a997165b033a0c89e44db9f78b7c2c Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.334585 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nvcsn" Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.334787 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nvcsn" Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.375758 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nvcsn" Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.846988 4941 generic.go:334] "Generic (PLEG): container finished" podID="60d00e10-29f3-4800-9974-ab73ec201798" containerID="425ed38f851f851a2fae0646dfeeb7666b0bf8e29e73f98516ecc282d1264bc8" exitCode=0 Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.847079 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" event={"ID":"60d00e10-29f3-4800-9974-ab73ec201798","Type":"ContainerDied","Data":"425ed38f851f851a2fae0646dfeeb7666b0bf8e29e73f98516ecc282d1264bc8"} Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.847348 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" event={"ID":"60d00e10-29f3-4800-9974-ab73ec201798","Type":"ContainerStarted","Data":"2ca1da9f7c87e8ce40967f70e2e90ba5ded126d8bd2ca775b16a13233de13f64"} Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.848742 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547840-hbmbh" event={"ID":"da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3","Type":"ContainerStarted","Data":"e8e7893c5426d3c7c1d807e1145d3ff222a997165b033a0c89e44db9f78b7c2c"} Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.909882 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nvcsn" Mar 07 08:00:01 crc kubenswrapper[4941]: I0307 08:00:01.964206 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nvcsn"] Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.122073 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.304207 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60d00e10-29f3-4800-9974-ab73ec201798-config-volume\") pod \"60d00e10-29f3-4800-9974-ab73ec201798\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.304271 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxwn5\" (UniqueName: \"kubernetes.io/projected/60d00e10-29f3-4800-9974-ab73ec201798-kube-api-access-pxwn5\") pod \"60d00e10-29f3-4800-9974-ab73ec201798\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.304357 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60d00e10-29f3-4800-9974-ab73ec201798-secret-volume\") pod \"60d00e10-29f3-4800-9974-ab73ec201798\" (UID: \"60d00e10-29f3-4800-9974-ab73ec201798\") " Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.304940 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60d00e10-29f3-4800-9974-ab73ec201798-config-volume" (OuterVolumeSpecName: "config-volume") pod "60d00e10-29f3-4800-9974-ab73ec201798" (UID: "60d00e10-29f3-4800-9974-ab73ec201798"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.311555 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d00e10-29f3-4800-9974-ab73ec201798-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "60d00e10-29f3-4800-9974-ab73ec201798" (UID: "60d00e10-29f3-4800-9974-ab73ec201798"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.311780 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d00e10-29f3-4800-9974-ab73ec201798-kube-api-access-pxwn5" (OuterVolumeSpecName: "kube-api-access-pxwn5") pod "60d00e10-29f3-4800-9974-ab73ec201798" (UID: "60d00e10-29f3-4800-9974-ab73ec201798"). InnerVolumeSpecName "kube-api-access-pxwn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.405956 4941 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60d00e10-29f3-4800-9974-ab73ec201798-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.406214 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxwn5\" (UniqueName: \"kubernetes.io/projected/60d00e10-29f3-4800-9974-ab73ec201798-kube-api-access-pxwn5\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.406286 4941 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60d00e10-29f3-4800-9974-ab73ec201798-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.867668 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" event={"ID":"60d00e10-29f3-4800-9974-ab73ec201798","Type":"ContainerDied","Data":"2ca1da9f7c87e8ce40967f70e2e90ba5ded126d8bd2ca775b16a13233de13f64"} Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.867725 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547840-r4fmc" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.867748 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca1da9f7c87e8ce40967f70e2e90ba5ded126d8bd2ca775b16a13233de13f64" Mar 07 08:00:03 crc kubenswrapper[4941]: I0307 08:00:03.867819 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nvcsn" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" containerName="registry-server" containerID="cri-o://9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9" gracePeriod=2 Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.218524 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn"] Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.224328 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547795-c47qn"] Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.288344 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nvcsn" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.320122 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-utilities\") pod \"d0a8df16-342b-472f-b696-ea5c595954d2\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.320197 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-catalog-content\") pod \"d0a8df16-342b-472f-b696-ea5c595954d2\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.320229 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpmqq\" (UniqueName: \"kubernetes.io/projected/d0a8df16-342b-472f-b696-ea5c595954d2-kube-api-access-bpmqq\") pod \"d0a8df16-342b-472f-b696-ea5c595954d2\" (UID: \"d0a8df16-342b-472f-b696-ea5c595954d2\") " Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.321372 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-utilities" (OuterVolumeSpecName: "utilities") pod "d0a8df16-342b-472f-b696-ea5c595954d2" (UID: "d0a8df16-342b-472f-b696-ea5c595954d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.364615 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a8df16-342b-472f-b696-ea5c595954d2-kube-api-access-bpmqq" (OuterVolumeSpecName: "kube-api-access-bpmqq") pod "d0a8df16-342b-472f-b696-ea5c595954d2" (UID: "d0a8df16-342b-472f-b696-ea5c595954d2"). InnerVolumeSpecName "kube-api-access-bpmqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.372978 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0a8df16-342b-472f-b696-ea5c595954d2" (UID: "d0a8df16-342b-472f-b696-ea5c595954d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.421877 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.421911 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpmqq\" (UniqueName: \"kubernetes.io/projected/d0a8df16-342b-472f-b696-ea5c595954d2-kube-api-access-bpmqq\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.421929 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a8df16-342b-472f-b696-ea5c595954d2-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.880202 4941 generic.go:334] "Generic (PLEG): container finished" podID="da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3" containerID="553617ba9c1cf8052d5a417a00a0ad25601358b956365d56911a9d0d23cba7dd" exitCode=0 Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.880283 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547840-hbmbh" event={"ID":"da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3","Type":"ContainerDied","Data":"553617ba9c1cf8052d5a417a00a0ad25601358b956365d56911a9d0d23cba7dd"} Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.884025 4941 generic.go:334] "Generic (PLEG): container 
finished" podID="d0a8df16-342b-472f-b696-ea5c595954d2" containerID="9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9" exitCode=0 Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.884093 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcsn" event={"ID":"d0a8df16-342b-472f-b696-ea5c595954d2","Type":"ContainerDied","Data":"9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9"} Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.884136 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvcsn" event={"ID":"d0a8df16-342b-472f-b696-ea5c595954d2","Type":"ContainerDied","Data":"31101a16441e255a505545d41904a59449e501952fa8736d1309e8bc050f3e02"} Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.884173 4941 scope.go:117] "RemoveContainer" containerID="9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.884383 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nvcsn" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.908123 4941 scope.go:117] "RemoveContainer" containerID="a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.947025 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nvcsn"] Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.954746 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nvcsn"] Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.955052 4941 scope.go:117] "RemoveContainer" containerID="8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.982595 4941 scope.go:117] "RemoveContainer" containerID="9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9" Mar 07 08:00:04 crc kubenswrapper[4941]: E0307 08:00:04.983036 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9\": container with ID starting with 9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9 not found: ID does not exist" containerID="9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.983067 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9"} err="failed to get container status \"9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9\": rpc error: code = NotFound desc = could not find container \"9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9\": container with ID starting with 9b2875eceb819dd031944e1cb34dd4f59f26ae1a7fa8f1dd75fd945c1585bde9 not 
found: ID does not exist" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.983092 4941 scope.go:117] "RemoveContainer" containerID="a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf" Mar 07 08:00:04 crc kubenswrapper[4941]: E0307 08:00:04.983525 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf\": container with ID starting with a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf not found: ID does not exist" containerID="a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.983544 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf"} err="failed to get container status \"a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf\": rpc error: code = NotFound desc = could not find container \"a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf\": container with ID starting with a29407690c4810603abca7eb9f06d3a712e55f99ecf5ddc2c94f211596ad33bf not found: ID does not exist" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.983555 4941 scope.go:117] "RemoveContainer" containerID="8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57" Mar 07 08:00:04 crc kubenswrapper[4941]: E0307 08:00:04.983789 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57\": container with ID starting with 8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57 not found: ID does not exist" containerID="8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57" Mar 07 08:00:04 crc kubenswrapper[4941]: I0307 08:00:04.983824 4941 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57"} err="failed to get container status \"8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57\": rpc error: code = NotFound desc = could not find container \"8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57\": container with ID starting with 8f0be642bac65857ddac9b213156920005d18eb7f99bfdc383fcc5b6a3948a57 not found: ID does not exist" Mar 07 08:00:05 crc kubenswrapper[4941]: I0307 08:00:05.014460 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 08:00:05 crc kubenswrapper[4941]: I0307 08:00:05.051929 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 08:00:05 crc kubenswrapper[4941]: I0307 08:00:05.966243 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" path="/var/lib/kubelet/pods/d0a8df16-342b-472f-b696-ea5c595954d2/volumes" Mar 07 08:00:05 crc kubenswrapper[4941]: I0307 08:00:05.968673 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b0380d-e6d2-473f-a49b-bdccb4747ccc" path="/var/lib/kubelet/pods/e7b0380d-e6d2-473f-a49b-bdccb4747ccc/volumes" Mar 07 08:00:06 crc kubenswrapper[4941]: I0307 08:00:06.198880 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547840-hbmbh" Mar 07 08:00:06 crc kubenswrapper[4941]: I0307 08:00:06.357846 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhxrl\" (UniqueName: \"kubernetes.io/projected/da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3-kube-api-access-vhxrl\") pod \"da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3\" (UID: \"da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3\") " Mar 07 08:00:06 crc kubenswrapper[4941]: I0307 08:00:06.368091 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3-kube-api-access-vhxrl" (OuterVolumeSpecName: "kube-api-access-vhxrl") pod "da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3" (UID: "da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3"). InnerVolumeSpecName "kube-api-access-vhxrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:00:06 crc kubenswrapper[4941]: I0307 08:00:06.460172 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhxrl\" (UniqueName: \"kubernetes.io/projected/da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3-kube-api-access-vhxrl\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:06 crc kubenswrapper[4941]: I0307 08:00:06.914377 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547840-hbmbh" event={"ID":"da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3","Type":"ContainerDied","Data":"e8e7893c5426d3c7c1d807e1145d3ff222a997165b033a0c89e44db9f78b7c2c"} Mar 07 08:00:06 crc kubenswrapper[4941]: I0307 08:00:06.914466 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e7893c5426d3c7c1d807e1145d3ff222a997165b033a0c89e44db9f78b7c2c" Mar 07 08:00:06 crc kubenswrapper[4941]: I0307 08:00:06.914509 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547840-hbmbh" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.011373 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xfqtp"] Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.011648 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xfqtp" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="registry-server" containerID="cri-o://68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e" gracePeriod=2 Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.270028 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547834-glmmb"] Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.281693 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547834-glmmb"] Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.486250 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.578941 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-catalog-content\") pod \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.578997 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tzzn\" (UniqueName: \"kubernetes.io/projected/f6d1f0bc-ce0a-48aa-b729-0d287217f619-kube-api-access-7tzzn\") pod \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.579096 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-utilities\") pod \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\" (UID: \"f6d1f0bc-ce0a-48aa-b729-0d287217f619\") " Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.581013 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-utilities" (OuterVolumeSpecName: "utilities") pod "f6d1f0bc-ce0a-48aa-b729-0d287217f619" (UID: "f6d1f0bc-ce0a-48aa-b729-0d287217f619"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.585049 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d1f0bc-ce0a-48aa-b729-0d287217f619-kube-api-access-7tzzn" (OuterVolumeSpecName: "kube-api-access-7tzzn") pod "f6d1f0bc-ce0a-48aa-b729-0d287217f619" (UID: "f6d1f0bc-ce0a-48aa-b729-0d287217f619"). InnerVolumeSpecName "kube-api-access-7tzzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.681581 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tzzn\" (UniqueName: \"kubernetes.io/projected/f6d1f0bc-ce0a-48aa-b729-0d287217f619-kube-api-access-7tzzn\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.681643 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.781016 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6d1f0bc-ce0a-48aa-b729-0d287217f619" (UID: "f6d1f0bc-ce0a-48aa-b729-0d287217f619"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.783029 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6d1f0bc-ce0a-48aa-b729-0d287217f619-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.927975 4941 generic.go:334] "Generic (PLEG): container finished" podID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerID="68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e" exitCode=0 Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.928037 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xfqtp" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.928068 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfqtp" event={"ID":"f6d1f0bc-ce0a-48aa-b729-0d287217f619","Type":"ContainerDied","Data":"68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e"} Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.928152 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfqtp" event={"ID":"f6d1f0bc-ce0a-48aa-b729-0d287217f619","Type":"ContainerDied","Data":"da9d1cdaac40fe8d30b9cbdf4016a8625c8531adb7902f6176117e0b981c3d44"} Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.928186 4941 scope.go:117] "RemoveContainer" containerID="68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e" Mar 07 08:00:07 crc kubenswrapper[4941]: I0307 08:00:07.960696 4941 scope.go:117] "RemoveContainer" containerID="ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108" Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.002372 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8610f6c7-0c60-4c3e-be92-c2c75caa6eb5" path="/var/lib/kubelet/pods/8610f6c7-0c60-4c3e-be92-c2c75caa6eb5/volumes" Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.004263 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xfqtp"] Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.016836 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xfqtp"] Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.020229 4941 scope.go:117] "RemoveContainer" containerID="150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f" Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.042473 4941 scope.go:117] "RemoveContainer" 
containerID="68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e" Mar 07 08:00:08 crc kubenswrapper[4941]: E0307 08:00:08.042926 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e\": container with ID starting with 68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e not found: ID does not exist" containerID="68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e" Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.042967 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e"} err="failed to get container status \"68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e\": rpc error: code = NotFound desc = could not find container \"68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e\": container with ID starting with 68c21b0efc4b4d8bb24ca0c93f4798581eae9b8bbaff79f90f1810ffdd65603e not found: ID does not exist" Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.042991 4941 scope.go:117] "RemoveContainer" containerID="ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108" Mar 07 08:00:08 crc kubenswrapper[4941]: E0307 08:00:08.043329 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108\": container with ID starting with ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108 not found: ID does not exist" containerID="ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108" Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.043388 4941 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108"} err="failed to get container status \"ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108\": rpc error: code = NotFound desc = could not find container \"ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108\": container with ID starting with ac00f22be00a8673bc696ba28791d195794aca310456e57c50b51dbb5eefd108 not found: ID does not exist" Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.043467 4941 scope.go:117] "RemoveContainer" containerID="150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f" Mar 07 08:00:08 crc kubenswrapper[4941]: E0307 08:00:08.043808 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f\": container with ID starting with 150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f not found: ID does not exist" containerID="150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f" Mar 07 08:00:08 crc kubenswrapper[4941]: I0307 08:00:08.043839 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f"} err="failed to get container status \"150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f\": rpc error: code = NotFound desc = could not find container \"150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f\": container with ID starting with 150a5eb7565dffb4af5ef091f941516421a407634bc05683d613d81fcf0e6c9f not found: ID does not exist" Mar 07 08:00:09 crc kubenswrapper[4941]: I0307 08:00:09.884152 4941 scope.go:117] "RemoveContainer" containerID="f985600e0206a94dcd2ea6d2d74f098ea465c83af3a9dbe0fc5e496322e954aa" Mar 07 08:00:09 crc kubenswrapper[4941]: I0307 08:00:09.951786 4941 scope.go:117] "RemoveContainer" 
containerID="65dd08c7d05870e8d4e5cd7a8abbbafcd882e2d57e5822eebbee49d373a690f3" Mar 07 08:00:09 crc kubenswrapper[4941]: I0307 08:00:09.970224 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" path="/var/lib/kubelet/pods/f6d1f0bc-ce0a-48aa-b729-0d287217f619/volumes" Mar 07 08:00:10 crc kubenswrapper[4941]: I0307 08:00:10.314230 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 08:00:10 crc kubenswrapper[4941]: I0307 08:00:10.314329 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 08:00:40 crc kubenswrapper[4941]: I0307 08:00:40.314708 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 08:00:40 crc kubenswrapper[4941]: I0307 08:00:40.315550 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 08:01:10 crc kubenswrapper[4941]: I0307 08:01:10.314260 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 08:01:10 crc kubenswrapper[4941]: I0307 08:01:10.314774 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 08:01:10 crc kubenswrapper[4941]: I0307 08:01:10.314826 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 08:01:10 crc kubenswrapper[4941]: I0307 08:01:10.315334 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1d0b5eb92e31d5229449e1e28ad7dec12cd8ead59c2438d98b39bdcd41bf81a"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 08:01:10 crc kubenswrapper[4941]: I0307 08:01:10.315426 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://a1d0b5eb92e31d5229449e1e28ad7dec12cd8ead59c2438d98b39bdcd41bf81a" gracePeriod=600 Mar 07 08:01:10 crc kubenswrapper[4941]: I0307 08:01:10.498182 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="a1d0b5eb92e31d5229449e1e28ad7dec12cd8ead59c2438d98b39bdcd41bf81a" exitCode=0 Mar 07 08:01:10 crc kubenswrapper[4941]: I0307 08:01:10.498233 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"a1d0b5eb92e31d5229449e1e28ad7dec12cd8ead59c2438d98b39bdcd41bf81a"} Mar 07 08:01:10 crc kubenswrapper[4941]: I0307 08:01:10.498272 4941 scope.go:117] "RemoveContainer" containerID="17d7326dcb47d5580a1a428b16a4054881ae3ac467cb0d37addb4a598b36fe92" Mar 07 08:01:11 crc kubenswrapper[4941]: I0307 08:01:11.508620 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e"} Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.285232 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xs77s"] Mar 07 08:01:49 crc kubenswrapper[4941]: E0307 08:01:49.286655 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="extract-utilities" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.286683 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="extract-utilities" Mar 07 08:01:49 crc kubenswrapper[4941]: E0307 08:01:49.286700 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="extract-content" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.286708 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="extract-content" Mar 07 08:01:49 crc kubenswrapper[4941]: E0307 08:01:49.286724 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3" containerName="oc" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.286732 4941 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3" containerName="oc" Mar 07 08:01:49 crc kubenswrapper[4941]: E0307 08:01:49.286747 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" containerName="registry-server" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.286754 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" containerName="registry-server" Mar 07 08:01:49 crc kubenswrapper[4941]: E0307 08:01:49.286768 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" containerName="extract-content" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.286775 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" containerName="extract-content" Mar 07 08:01:49 crc kubenswrapper[4941]: E0307 08:01:49.286791 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d00e10-29f3-4800-9974-ab73ec201798" containerName="collect-profiles" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.286799 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d00e10-29f3-4800-9974-ab73ec201798" containerName="collect-profiles" Mar 07 08:01:49 crc kubenswrapper[4941]: E0307 08:01:49.286821 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" containerName="extract-utilities" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.286829 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" containerName="extract-utilities" Mar 07 08:01:49 crc kubenswrapper[4941]: E0307 08:01:49.286846 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="registry-server" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.286854 4941 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="registry-server" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.287031 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d1f0bc-ce0a-48aa-b729-0d287217f619" containerName="registry-server" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.287054 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a8df16-342b-472f-b696-ea5c595954d2" containerName="registry-server" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.287064 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d00e10-29f3-4800-9974-ab73ec201798" containerName="collect-profiles" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.287084 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3" containerName="oc" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.289873 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.302955 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xs77s"] Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.469973 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-utilities\") pod \"certified-operators-xs77s\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.470128 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-catalog-content\") pod \"certified-operators-xs77s\" (UID: 
\"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.470189 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgt82\" (UniqueName: \"kubernetes.io/projected/1266f15c-f2a8-42ee-81fc-c411ee7e300d-kube-api-access-kgt82\") pod \"certified-operators-xs77s\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.571970 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-utilities\") pod \"certified-operators-xs77s\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.572324 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-catalog-content\") pod \"certified-operators-xs77s\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.572457 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgt82\" (UniqueName: \"kubernetes.io/projected/1266f15c-f2a8-42ee-81fc-c411ee7e300d-kube-api-access-kgt82\") pod \"certified-operators-xs77s\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.572847 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-catalog-content\") pod 
\"certified-operators-xs77s\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.573026 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-utilities\") pod \"certified-operators-xs77s\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.595264 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgt82\" (UniqueName: \"kubernetes.io/projected/1266f15c-f2a8-42ee-81fc-c411ee7e300d-kube-api-access-kgt82\") pod \"certified-operators-xs77s\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:49 crc kubenswrapper[4941]: I0307 08:01:49.639330 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:50 crc kubenswrapper[4941]: I0307 08:01:50.089820 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xs77s"] Mar 07 08:01:50 crc kubenswrapper[4941]: W0307 08:01:50.092473 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1266f15c_f2a8_42ee_81fc_c411ee7e300d.slice/crio-218cc76959de2b38be01c004fcf3fff04b6e14ae53a3645f423bac926bb61efd WatchSource:0}: Error finding container 218cc76959de2b38be01c004fcf3fff04b6e14ae53a3645f423bac926bb61efd: Status 404 returned error can't find the container with id 218cc76959de2b38be01c004fcf3fff04b6e14ae53a3645f423bac926bb61efd Mar 07 08:01:50 crc kubenswrapper[4941]: I0307 08:01:50.859918 4941 generic.go:334] "Generic (PLEG): container finished" podID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerID="cfb3625a2d2a2f08313272a38a91a0815fed4a578ee8d4ef0748cc8646ef1091" exitCode=0 Mar 07 08:01:50 crc kubenswrapper[4941]: I0307 08:01:50.860190 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xs77s" event={"ID":"1266f15c-f2a8-42ee-81fc-c411ee7e300d","Type":"ContainerDied","Data":"cfb3625a2d2a2f08313272a38a91a0815fed4a578ee8d4ef0748cc8646ef1091"} Mar 07 08:01:50 crc kubenswrapper[4941]: I0307 08:01:50.860231 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xs77s" event={"ID":"1266f15c-f2a8-42ee-81fc-c411ee7e300d","Type":"ContainerStarted","Data":"218cc76959de2b38be01c004fcf3fff04b6e14ae53a3645f423bac926bb61efd"} Mar 07 08:01:51 crc kubenswrapper[4941]: I0307 08:01:51.877726 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xs77s" 
event={"ID":"1266f15c-f2a8-42ee-81fc-c411ee7e300d","Type":"ContainerStarted","Data":"2caac12359f78aca02c38227bc66911e872249a581d6f92a6ce5aef06d67c886"} Mar 07 08:01:52 crc kubenswrapper[4941]: I0307 08:01:52.889424 4941 generic.go:334] "Generic (PLEG): container finished" podID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerID="2caac12359f78aca02c38227bc66911e872249a581d6f92a6ce5aef06d67c886" exitCode=0 Mar 07 08:01:52 crc kubenswrapper[4941]: I0307 08:01:52.889524 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xs77s" event={"ID":"1266f15c-f2a8-42ee-81fc-c411ee7e300d","Type":"ContainerDied","Data":"2caac12359f78aca02c38227bc66911e872249a581d6f92a6ce5aef06d67c886"} Mar 07 08:01:54 crc kubenswrapper[4941]: I0307 08:01:54.909532 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xs77s" event={"ID":"1266f15c-f2a8-42ee-81fc-c411ee7e300d","Type":"ContainerStarted","Data":"44f4d5ca275bc001b060f6ca5e98778e0ba7c32ac2b0b84bc45d7a39d546b81b"} Mar 07 08:01:54 crc kubenswrapper[4941]: I0307 08:01:54.941543 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xs77s" podStartSLOduration=2.864025047 podStartE2EDuration="5.941516976s" podCreationTimestamp="2026-03-07 08:01:49 +0000 UTC" firstStartedPulling="2026-03-07 08:01:50.862668195 +0000 UTC m=+4207.815033700" lastFinishedPulling="2026-03-07 08:01:53.940160144 +0000 UTC m=+4210.892525629" observedRunningTime="2026-03-07 08:01:54.931221238 +0000 UTC m=+4211.883586773" watchObservedRunningTime="2026-03-07 08:01:54.941516976 +0000 UTC m=+4211.893882481" Mar 07 08:01:59 crc kubenswrapper[4941]: I0307 08:01:59.639803 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:59 crc kubenswrapper[4941]: I0307 08:01:59.640620 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:01:59 crc kubenswrapper[4941]: I0307 08:01:59.722687 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.017949 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.071326 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xs77s"] Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.161113 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547842-n4pcb"] Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.162298 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547842-n4pcb" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.166264 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.166903 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.167530 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.167813 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547842-n4pcb"] Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.203917 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q67s\" (UniqueName: \"kubernetes.io/projected/1aa4b3ca-db53-46ff-bae2-e878e699b64d-kube-api-access-2q67s\") pod 
\"auto-csr-approver-29547842-n4pcb\" (UID: \"1aa4b3ca-db53-46ff-bae2-e878e699b64d\") " pod="openshift-infra/auto-csr-approver-29547842-n4pcb" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.306145 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q67s\" (UniqueName: \"kubernetes.io/projected/1aa4b3ca-db53-46ff-bae2-e878e699b64d-kube-api-access-2q67s\") pod \"auto-csr-approver-29547842-n4pcb\" (UID: \"1aa4b3ca-db53-46ff-bae2-e878e699b64d\") " pod="openshift-infra/auto-csr-approver-29547842-n4pcb" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.381378 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q67s\" (UniqueName: \"kubernetes.io/projected/1aa4b3ca-db53-46ff-bae2-e878e699b64d-kube-api-access-2q67s\") pod \"auto-csr-approver-29547842-n4pcb\" (UID: \"1aa4b3ca-db53-46ff-bae2-e878e699b64d\") " pod="openshift-infra/auto-csr-approver-29547842-n4pcb" Mar 07 08:02:00 crc kubenswrapper[4941]: I0307 08:02:00.496229 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547842-n4pcb" Mar 07 08:02:01 crc kubenswrapper[4941]: I0307 08:02:01.037220 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547842-n4pcb"] Mar 07 08:02:01 crc kubenswrapper[4941]: I0307 08:02:01.978230 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xs77s" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerName="registry-server" containerID="cri-o://44f4d5ca275bc001b060f6ca5e98778e0ba7c32ac2b0b84bc45d7a39d546b81b" gracePeriod=2 Mar 07 08:02:01 crc kubenswrapper[4941]: I0307 08:02:01.978604 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547842-n4pcb" event={"ID":"1aa4b3ca-db53-46ff-bae2-e878e699b64d","Type":"ContainerStarted","Data":"b4d9532fd58404941f868a3a9e031661389f4a66249bf7bb8a34f00681db5bc3"} Mar 07 08:02:02 crc kubenswrapper[4941]: I0307 08:02:02.989018 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547842-n4pcb" event={"ID":"1aa4b3ca-db53-46ff-bae2-e878e699b64d","Type":"ContainerStarted","Data":"750999ffbe8c014cc77dc6a1986dfc4d3a05ce8e9a34ee1a1d36bad99eee6318"} Mar 07 08:02:02 crc kubenswrapper[4941]: I0307 08:02:02.991460 4941 generic.go:334] "Generic (PLEG): container finished" podID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerID="44f4d5ca275bc001b060f6ca5e98778e0ba7c32ac2b0b84bc45d7a39d546b81b" exitCode=0 Mar 07 08:02:02 crc kubenswrapper[4941]: I0307 08:02:02.991502 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xs77s" event={"ID":"1266f15c-f2a8-42ee-81fc-c411ee7e300d","Type":"ContainerDied","Data":"44f4d5ca275bc001b060f6ca5e98778e0ba7c32ac2b0b84bc45d7a39d546b81b"} Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.004531 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29547842-n4pcb" podStartSLOduration=2.045495996 podStartE2EDuration="3.004508888s" podCreationTimestamp="2026-03-07 08:02:00 +0000 UTC" firstStartedPulling="2026-03-07 08:02:01.05154169 +0000 UTC m=+4218.003907165" lastFinishedPulling="2026-03-07 08:02:02.010554552 +0000 UTC m=+4218.962920057" observedRunningTime="2026-03-07 08:02:03.001014411 +0000 UTC m=+4219.953379876" watchObservedRunningTime="2026-03-07 08:02:03.004508888 +0000 UTC m=+4219.956874363" Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.690515 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.762282 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-catalog-content\") pod \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.762358 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgt82\" (UniqueName: \"kubernetes.io/projected/1266f15c-f2a8-42ee-81fc-c411ee7e300d-kube-api-access-kgt82\") pod \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.762386 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-utilities\") pod \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\" (UID: \"1266f15c-f2a8-42ee-81fc-c411ee7e300d\") " Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.763691 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-utilities" 
(OuterVolumeSpecName: "utilities") pod "1266f15c-f2a8-42ee-81fc-c411ee7e300d" (UID: "1266f15c-f2a8-42ee-81fc-c411ee7e300d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.768060 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1266f15c-f2a8-42ee-81fc-c411ee7e300d-kube-api-access-kgt82" (OuterVolumeSpecName: "kube-api-access-kgt82") pod "1266f15c-f2a8-42ee-81fc-c411ee7e300d" (UID: "1266f15c-f2a8-42ee-81fc-c411ee7e300d"). InnerVolumeSpecName "kube-api-access-kgt82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.823459 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1266f15c-f2a8-42ee-81fc-c411ee7e300d" (UID: "1266f15c-f2a8-42ee-81fc-c411ee7e300d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.864016 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgt82\" (UniqueName: \"kubernetes.io/projected/1266f15c-f2a8-42ee-81fc-c411ee7e300d-kube-api-access-kgt82\") on node \"crc\" DevicePath \"\"" Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.864047 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 08:02:03 crc kubenswrapper[4941]: I0307 08:02:03.864058 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1266f15c-f2a8-42ee-81fc-c411ee7e300d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.000846 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xs77s" event={"ID":"1266f15c-f2a8-42ee-81fc-c411ee7e300d","Type":"ContainerDied","Data":"218cc76959de2b38be01c004fcf3fff04b6e14ae53a3645f423bac926bb61efd"} Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.000997 4941 scope.go:117] "RemoveContainer" containerID="44f4d5ca275bc001b060f6ca5e98778e0ba7c32ac2b0b84bc45d7a39d546b81b" Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.001115 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xs77s" Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.007929 4941 generic.go:334] "Generic (PLEG): container finished" podID="1aa4b3ca-db53-46ff-bae2-e878e699b64d" containerID="750999ffbe8c014cc77dc6a1986dfc4d3a05ce8e9a34ee1a1d36bad99eee6318" exitCode=0 Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.007998 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547842-n4pcb" event={"ID":"1aa4b3ca-db53-46ff-bae2-e878e699b64d","Type":"ContainerDied","Data":"750999ffbe8c014cc77dc6a1986dfc4d3a05ce8e9a34ee1a1d36bad99eee6318"} Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.035674 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xs77s"] Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.046902 4941 scope.go:117] "RemoveContainer" containerID="2caac12359f78aca02c38227bc66911e872249a581d6f92a6ce5aef06d67c886" Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.051038 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xs77s"] Mar 07 08:02:04 crc kubenswrapper[4941]: I0307 08:02:04.065137 4941 scope.go:117] "RemoveContainer" containerID="cfb3625a2d2a2f08313272a38a91a0815fed4a578ee8d4ef0748cc8646ef1091" Mar 07 08:02:05 crc kubenswrapper[4941]: I0307 08:02:05.331946 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547842-n4pcb" Mar 07 08:02:05 crc kubenswrapper[4941]: I0307 08:02:05.384668 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q67s\" (UniqueName: \"kubernetes.io/projected/1aa4b3ca-db53-46ff-bae2-e878e699b64d-kube-api-access-2q67s\") pod \"1aa4b3ca-db53-46ff-bae2-e878e699b64d\" (UID: \"1aa4b3ca-db53-46ff-bae2-e878e699b64d\") " Mar 07 08:02:05 crc kubenswrapper[4941]: I0307 08:02:05.390388 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa4b3ca-db53-46ff-bae2-e878e699b64d-kube-api-access-2q67s" (OuterVolumeSpecName: "kube-api-access-2q67s") pod "1aa4b3ca-db53-46ff-bae2-e878e699b64d" (UID: "1aa4b3ca-db53-46ff-bae2-e878e699b64d"). InnerVolumeSpecName "kube-api-access-2q67s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:02:05 crc kubenswrapper[4941]: I0307 08:02:05.486605 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q67s\" (UniqueName: \"kubernetes.io/projected/1aa4b3ca-db53-46ff-bae2-e878e699b64d-kube-api-access-2q67s\") on node \"crc\" DevicePath \"\"" Mar 07 08:02:05 crc kubenswrapper[4941]: I0307 08:02:05.965238 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" path="/var/lib/kubelet/pods/1266f15c-f2a8-42ee-81fc-c411ee7e300d/volumes" Mar 07 08:02:06 crc kubenswrapper[4941]: I0307 08:02:06.029172 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547842-n4pcb" event={"ID":"1aa4b3ca-db53-46ff-bae2-e878e699b64d","Type":"ContainerDied","Data":"b4d9532fd58404941f868a3a9e031661389f4a66249bf7bb8a34f00681db5bc3"} Mar 07 08:02:06 crc kubenswrapper[4941]: I0307 08:02:06.029218 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4d9532fd58404941f868a3a9e031661389f4a66249bf7bb8a34f00681db5bc3" Mar 07 08:02:06 
crc kubenswrapper[4941]: I0307 08:02:06.029667 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547842-n4pcb" Mar 07 08:02:06 crc kubenswrapper[4941]: I0307 08:02:06.083169 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547836-v2xgb"] Mar 07 08:02:06 crc kubenswrapper[4941]: I0307 08:02:06.094683 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547836-v2xgb"] Mar 07 08:02:07 crc kubenswrapper[4941]: I0307 08:02:07.969990 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11eb1bfb-3396-4b0d-a07a-55ff9946e2ed" path="/var/lib/kubelet/pods/11eb1bfb-3396-4b0d-a07a-55ff9946e2ed/volumes" Mar 07 08:02:10 crc kubenswrapper[4941]: I0307 08:02:10.102962 4941 scope.go:117] "RemoveContainer" containerID="eb3e73d76f1ebd9c5862cca54e476593592fbf0cba2a9406c254257e887179c3" Mar 07 08:03:10 crc kubenswrapper[4941]: I0307 08:03:10.314805 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 08:03:10 crc kubenswrapper[4941]: I0307 08:03:10.315367 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 08:03:40 crc kubenswrapper[4941]: I0307 08:03:40.314399 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 08:03:40 crc kubenswrapper[4941]: I0307 08:03:40.315134 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.147674 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547844-4ksm7"] Mar 07 08:04:00 crc kubenswrapper[4941]: E0307 08:04:00.148495 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa4b3ca-db53-46ff-bae2-e878e699b64d" containerName="oc" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.148508 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa4b3ca-db53-46ff-bae2-e878e699b64d" containerName="oc" Mar 07 08:04:00 crc kubenswrapper[4941]: E0307 08:04:00.148522 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerName="extract-content" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.148528 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerName="extract-content" Mar 07 08:04:00 crc kubenswrapper[4941]: E0307 08:04:00.148544 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerName="extract-utilities" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.148550 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerName="extract-utilities" Mar 07 08:04:00 crc kubenswrapper[4941]: E0307 08:04:00.148561 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" 
containerName="registry-server" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.148568 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerName="registry-server" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.148687 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1266f15c-f2a8-42ee-81fc-c411ee7e300d" containerName="registry-server" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.148702 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa4b3ca-db53-46ff-bae2-e878e699b64d" containerName="oc" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.149201 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547844-4ksm7" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.151866 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.152277 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.152370 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.163882 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547844-4ksm7"] Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.325723 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk4j2\" (UniqueName: \"kubernetes.io/projected/af9b7921-0e74-4196-ba1f-c488ddf24b10-kube-api-access-gk4j2\") pod \"auto-csr-approver-29547844-4ksm7\" (UID: \"af9b7921-0e74-4196-ba1f-c488ddf24b10\") " pod="openshift-infra/auto-csr-approver-29547844-4ksm7" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 
08:04:00.426500 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk4j2\" (UniqueName: \"kubernetes.io/projected/af9b7921-0e74-4196-ba1f-c488ddf24b10-kube-api-access-gk4j2\") pod \"auto-csr-approver-29547844-4ksm7\" (UID: \"af9b7921-0e74-4196-ba1f-c488ddf24b10\") " pod="openshift-infra/auto-csr-approver-29547844-4ksm7" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.450309 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk4j2\" (UniqueName: \"kubernetes.io/projected/af9b7921-0e74-4196-ba1f-c488ddf24b10-kube-api-access-gk4j2\") pod \"auto-csr-approver-29547844-4ksm7\" (UID: \"af9b7921-0e74-4196-ba1f-c488ddf24b10\") " pod="openshift-infra/auto-csr-approver-29547844-4ksm7" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.466006 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547844-4ksm7" Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.920416 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547844-4ksm7"] Mar 07 08:04:00 crc kubenswrapper[4941]: I0307 08:04:00.976748 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547844-4ksm7" event={"ID":"af9b7921-0e74-4196-ba1f-c488ddf24b10","Type":"ContainerStarted","Data":"b8bd6264b49fce1691bbfdcb5259bf3223067783ac207ab90ee038973c4235f7"} Mar 07 08:04:02 crc kubenswrapper[4941]: I0307 08:04:02.993627 4941 generic.go:334] "Generic (PLEG): container finished" podID="af9b7921-0e74-4196-ba1f-c488ddf24b10" containerID="ed383bab6c6642898a4591825b116d1ed82c696e12faf013cf3abd1d4a602cc8" exitCode=0 Mar 07 08:04:02 crc kubenswrapper[4941]: I0307 08:04:02.993713 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547844-4ksm7" 
event={"ID":"af9b7921-0e74-4196-ba1f-c488ddf24b10","Type":"ContainerDied","Data":"ed383bab6c6642898a4591825b116d1ed82c696e12faf013cf3abd1d4a602cc8"} Mar 07 08:04:04 crc kubenswrapper[4941]: I0307 08:04:04.314284 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547844-4ksm7" Mar 07 08:04:04 crc kubenswrapper[4941]: I0307 08:04:04.497981 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk4j2\" (UniqueName: \"kubernetes.io/projected/af9b7921-0e74-4196-ba1f-c488ddf24b10-kube-api-access-gk4j2\") pod \"af9b7921-0e74-4196-ba1f-c488ddf24b10\" (UID: \"af9b7921-0e74-4196-ba1f-c488ddf24b10\") " Mar 07 08:04:04 crc kubenswrapper[4941]: I0307 08:04:04.503955 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af9b7921-0e74-4196-ba1f-c488ddf24b10-kube-api-access-gk4j2" (OuterVolumeSpecName: "kube-api-access-gk4j2") pod "af9b7921-0e74-4196-ba1f-c488ddf24b10" (UID: "af9b7921-0e74-4196-ba1f-c488ddf24b10"). InnerVolumeSpecName "kube-api-access-gk4j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:04:04 crc kubenswrapper[4941]: I0307 08:04:04.599418 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk4j2\" (UniqueName: \"kubernetes.io/projected/af9b7921-0e74-4196-ba1f-c488ddf24b10-kube-api-access-gk4j2\") on node \"crc\" DevicePath \"\"" Mar 07 08:04:05 crc kubenswrapper[4941]: I0307 08:04:05.010236 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547844-4ksm7" event={"ID":"af9b7921-0e74-4196-ba1f-c488ddf24b10","Type":"ContainerDied","Data":"b8bd6264b49fce1691bbfdcb5259bf3223067783ac207ab90ee038973c4235f7"} Mar 07 08:04:05 crc kubenswrapper[4941]: I0307 08:04:05.010307 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547844-4ksm7" Mar 07 08:04:05 crc kubenswrapper[4941]: I0307 08:04:05.010330 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8bd6264b49fce1691bbfdcb5259bf3223067783ac207ab90ee038973c4235f7" Mar 07 08:04:05 crc kubenswrapper[4941]: I0307 08:04:05.394851 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547838-dz7hs"] Mar 07 08:04:05 crc kubenswrapper[4941]: I0307 08:04:05.399252 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547838-dz7hs"] Mar 07 08:04:05 crc kubenswrapper[4941]: I0307 08:04:05.969147 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fef60d16-b229-4718-8438-d39698de0607" path="/var/lib/kubelet/pods/fef60d16-b229-4718-8438-d39698de0607/volumes" Mar 07 08:04:10 crc kubenswrapper[4941]: I0307 08:04:10.230809 4941 scope.go:117] "RemoveContainer" containerID="d4173fa55db3f5762a8f5d08e6debfa06e56eaec69b92f34c0f286a8b833008e" Mar 07 08:04:10 crc kubenswrapper[4941]: I0307 08:04:10.314975 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 08:04:10 crc kubenswrapper[4941]: I0307 08:04:10.315371 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 08:04:10 crc kubenswrapper[4941]: I0307 08:04:10.315483 4941 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-knkqz" Mar 07 08:04:10 crc kubenswrapper[4941]: I0307 08:04:10.316177 4941 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e"} pod="openshift-machine-config-operator/machine-config-daemon-knkqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 08:04:10 crc kubenswrapper[4941]: I0307 08:04:10.316257 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" containerID="cri-o://504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" gracePeriod=600 Mar 07 08:04:11 crc kubenswrapper[4941]: I0307 08:04:11.065287 4941 generic.go:334] "Generic (PLEG): container finished" podID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" exitCode=0 Mar 07 08:04:11 crc kubenswrapper[4941]: I0307 08:04:11.065361 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerDied","Data":"504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e"} Mar 07 08:04:11 crc kubenswrapper[4941]: I0307 08:04:11.065489 4941 scope.go:117] "RemoveContainer" containerID="a1d0b5eb92e31d5229449e1e28ad7dec12cd8ead59c2438d98b39bdcd41bf81a" Mar 07 08:04:11 crc kubenswrapper[4941]: E0307 08:04:11.307187 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:04:12 crc kubenswrapper[4941]: I0307 08:04:12.078810 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:04:12 crc kubenswrapper[4941]: E0307 08:04:12.079829 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:04:23 crc kubenswrapper[4941]: I0307 08:04:23.962905 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:04:23 crc kubenswrapper[4941]: E0307 08:04:23.963708 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:04:36 crc kubenswrapper[4941]: I0307 08:04:36.953964 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:04:36 crc kubenswrapper[4941]: E0307 08:04:36.954533 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:04:50 crc kubenswrapper[4941]: I0307 08:04:50.955579 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:04:50 crc kubenswrapper[4941]: E0307 08:04:50.956820 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:05:04 crc kubenswrapper[4941]: I0307 08:05:04.954961 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:05:04 crc kubenswrapper[4941]: E0307 08:05:04.955862 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:05:15 crc kubenswrapper[4941]: I0307 08:05:15.954854 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:05:15 crc kubenswrapper[4941]: E0307 08:05:15.955864 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:05:28 crc kubenswrapper[4941]: I0307 08:05:28.955735 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:05:28 crc kubenswrapper[4941]: E0307 08:05:28.956886 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.067449 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-krlnv/must-gather-fwwrw"] Mar 07 08:05:37 crc kubenswrapper[4941]: E0307 08:05:37.068328 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9b7921-0e74-4196-ba1f-c488ddf24b10" containerName="oc" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.068345 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9b7921-0e74-4196-ba1f-c488ddf24b10" containerName="oc" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.068557 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="af9b7921-0e74-4196-ba1f-c488ddf24b10" containerName="oc" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.069477 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.073025 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-krlnv"/"openshift-service-ca.crt" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.073387 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-krlnv"/"kube-root-ca.crt" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.106772 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-krlnv/must-gather-fwwrw"] Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.253049 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxgt\" (UniqueName: \"kubernetes.io/projected/55fc6e25-cfc9-49b7-a641-bfa456c35640-kube-api-access-2mxgt\") pod \"must-gather-fwwrw\" (UID: \"55fc6e25-cfc9-49b7-a641-bfa456c35640\") " pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.253367 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55fc6e25-cfc9-49b7-a641-bfa456c35640-must-gather-output\") pod \"must-gather-fwwrw\" (UID: \"55fc6e25-cfc9-49b7-a641-bfa456c35640\") " pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.354729 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxgt\" (UniqueName: \"kubernetes.io/projected/55fc6e25-cfc9-49b7-a641-bfa456c35640-kube-api-access-2mxgt\") pod \"must-gather-fwwrw\" (UID: \"55fc6e25-cfc9-49b7-a641-bfa456c35640\") " pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.354787 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55fc6e25-cfc9-49b7-a641-bfa456c35640-must-gather-output\") pod \"must-gather-fwwrw\" (UID: \"55fc6e25-cfc9-49b7-a641-bfa456c35640\") " pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.355266 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55fc6e25-cfc9-49b7-a641-bfa456c35640-must-gather-output\") pod \"must-gather-fwwrw\" (UID: \"55fc6e25-cfc9-49b7-a641-bfa456c35640\") " pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.379250 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxgt\" (UniqueName: \"kubernetes.io/projected/55fc6e25-cfc9-49b7-a641-bfa456c35640-kube-api-access-2mxgt\") pod \"must-gather-fwwrw\" (UID: \"55fc6e25-cfc9-49b7-a641-bfa456c35640\") " pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.387646 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.926509 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-krlnv/must-gather-fwwrw"] Mar 07 08:05:37 crc kubenswrapper[4941]: I0307 08:05:37.936072 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 08:05:38 crc kubenswrapper[4941]: I0307 08:05:38.735331 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-krlnv/must-gather-fwwrw" event={"ID":"55fc6e25-cfc9-49b7-a641-bfa456c35640","Type":"ContainerStarted","Data":"81844c1baebd54af662ba8688c974a15e5a84901124198f4da94a8c1cb712c32"} Mar 07 08:05:39 crc kubenswrapper[4941]: I0307 08:05:39.960134 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:05:39 crc kubenswrapper[4941]: E0307 08:05:39.960308 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:05:44 crc kubenswrapper[4941]: I0307 08:05:44.776707 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-krlnv/must-gather-fwwrw" event={"ID":"55fc6e25-cfc9-49b7-a641-bfa456c35640","Type":"ContainerStarted","Data":"7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044"} Mar 07 08:05:44 crc kubenswrapper[4941]: I0307 08:05:44.777351 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-krlnv/must-gather-fwwrw" 
event={"ID":"55fc6e25-cfc9-49b7-a641-bfa456c35640","Type":"ContainerStarted","Data":"5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2"} Mar 07 08:05:53 crc kubenswrapper[4941]: I0307 08:05:53.959452 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:05:53 crc kubenswrapper[4941]: E0307 08:05:53.960548 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.136121 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-krlnv/must-gather-fwwrw" podStartSLOduration=16.930226601 podStartE2EDuration="23.136097843s" podCreationTimestamp="2026-03-07 08:05:37 +0000 UTC" firstStartedPulling="2026-03-07 08:05:37.935995743 +0000 UTC m=+4434.888361208" lastFinishedPulling="2026-03-07 08:05:44.141866985 +0000 UTC m=+4441.094232450" observedRunningTime="2026-03-07 08:05:44.80105429 +0000 UTC m=+4441.753419775" watchObservedRunningTime="2026-03-07 08:06:00.136097843 +0000 UTC m=+4457.088463328" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.144812 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547846-kt4cw"] Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.145923 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547846-kt4cw" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.151525 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.151710 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.151784 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.158311 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547846-kt4cw"] Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.319773 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9fln\" (UniqueName: \"kubernetes.io/projected/b46707f7-6b0d-43e9-a652-c17c984f4ccf-kube-api-access-z9fln\") pod \"auto-csr-approver-29547846-kt4cw\" (UID: \"b46707f7-6b0d-43e9-a652-c17c984f4ccf\") " pod="openshift-infra/auto-csr-approver-29547846-kt4cw" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.421086 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9fln\" (UniqueName: \"kubernetes.io/projected/b46707f7-6b0d-43e9-a652-c17c984f4ccf-kube-api-access-z9fln\") pod \"auto-csr-approver-29547846-kt4cw\" (UID: \"b46707f7-6b0d-43e9-a652-c17c984f4ccf\") " pod="openshift-infra/auto-csr-approver-29547846-kt4cw" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.447028 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9fln\" (UniqueName: \"kubernetes.io/projected/b46707f7-6b0d-43e9-a652-c17c984f4ccf-kube-api-access-z9fln\") pod \"auto-csr-approver-29547846-kt4cw\" (UID: \"b46707f7-6b0d-43e9-a652-c17c984f4ccf\") " 
pod="openshift-infra/auto-csr-approver-29547846-kt4cw" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.465486 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547846-kt4cw" Mar 07 08:06:00 crc kubenswrapper[4941]: I0307 08:06:00.874244 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547846-kt4cw"] Mar 07 08:06:00 crc kubenswrapper[4941]: W0307 08:06:00.882477 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb46707f7_6b0d_43e9_a652_c17c984f4ccf.slice/crio-162490310cebbe31abc0aba8a3e878f8c79eb0e4e30045499d136014b0aa6ad8 WatchSource:0}: Error finding container 162490310cebbe31abc0aba8a3e878f8c79eb0e4e30045499d136014b0aa6ad8: Status 404 returned error can't find the container with id 162490310cebbe31abc0aba8a3e878f8c79eb0e4e30045499d136014b0aa6ad8 Mar 07 08:06:01 crc kubenswrapper[4941]: I0307 08:06:01.881890 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547846-kt4cw" event={"ID":"b46707f7-6b0d-43e9-a652-c17c984f4ccf","Type":"ContainerStarted","Data":"162490310cebbe31abc0aba8a3e878f8c79eb0e4e30045499d136014b0aa6ad8"} Mar 07 08:06:02 crc kubenswrapper[4941]: I0307 08:06:02.892286 4941 generic.go:334] "Generic (PLEG): container finished" podID="b46707f7-6b0d-43e9-a652-c17c984f4ccf" containerID="12c58124dcfa9d5c8f44e7085ba9dafd9ee1f47dfe925bd227fe6b6676a290fd" exitCode=0 Mar 07 08:06:02 crc kubenswrapper[4941]: I0307 08:06:02.892338 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547846-kt4cw" event={"ID":"b46707f7-6b0d-43e9-a652-c17c984f4ccf","Type":"ContainerDied","Data":"12c58124dcfa9d5c8f44e7085ba9dafd9ee1f47dfe925bd227fe6b6676a290fd"} Mar 07 08:06:04 crc kubenswrapper[4941]: I0307 08:06:04.127756 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547846-kt4cw" Mar 07 08:06:04 crc kubenswrapper[4941]: I0307 08:06:04.176180 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9fln\" (UniqueName: \"kubernetes.io/projected/b46707f7-6b0d-43e9-a652-c17c984f4ccf-kube-api-access-z9fln\") pod \"b46707f7-6b0d-43e9-a652-c17c984f4ccf\" (UID: \"b46707f7-6b0d-43e9-a652-c17c984f4ccf\") " Mar 07 08:06:04 crc kubenswrapper[4941]: I0307 08:06:04.181648 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46707f7-6b0d-43e9-a652-c17c984f4ccf-kube-api-access-z9fln" (OuterVolumeSpecName: "kube-api-access-z9fln") pod "b46707f7-6b0d-43e9-a652-c17c984f4ccf" (UID: "b46707f7-6b0d-43e9-a652-c17c984f4ccf"). InnerVolumeSpecName "kube-api-access-z9fln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:06:04 crc kubenswrapper[4941]: I0307 08:06:04.278391 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9fln\" (UniqueName: \"kubernetes.io/projected/b46707f7-6b0d-43e9-a652-c17c984f4ccf-kube-api-access-z9fln\") on node \"crc\" DevicePath \"\"" Mar 07 08:06:04 crc kubenswrapper[4941]: I0307 08:06:04.909319 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547846-kt4cw" event={"ID":"b46707f7-6b0d-43e9-a652-c17c984f4ccf","Type":"ContainerDied","Data":"162490310cebbe31abc0aba8a3e878f8c79eb0e4e30045499d136014b0aa6ad8"} Mar 07 08:06:04 crc kubenswrapper[4941]: I0307 08:06:04.909974 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="162490310cebbe31abc0aba8a3e878f8c79eb0e4e30045499d136014b0aa6ad8" Mar 07 08:06:04 crc kubenswrapper[4941]: I0307 08:06:04.909421 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547846-kt4cw" Mar 07 08:06:05 crc kubenswrapper[4941]: I0307 08:06:05.206470 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547840-hbmbh"] Mar 07 08:06:05 crc kubenswrapper[4941]: I0307 08:06:05.214517 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547840-hbmbh"] Mar 07 08:06:05 crc kubenswrapper[4941]: I0307 08:06:05.954052 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:06:05 crc kubenswrapper[4941]: E0307 08:06:05.954679 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:06:05 crc kubenswrapper[4941]: I0307 08:06:05.963803 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3" path="/var/lib/kubelet/pods/da4abbf1-9c5b-479c-bfc6-5e8cf7da16a3/volumes" Mar 07 08:06:10 crc kubenswrapper[4941]: I0307 08:06:10.324779 4941 scope.go:117] "RemoveContainer" containerID="553617ba9c1cf8052d5a417a00a0ad25601358b956365d56911a9d0d23cba7dd" Mar 07 08:06:16 crc kubenswrapper[4941]: I0307 08:06:16.955133 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:06:16 crc kubenswrapper[4941]: E0307 08:06:16.955883 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:06:31 crc kubenswrapper[4941]: I0307 08:06:31.954377 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:06:31 crc kubenswrapper[4941]: E0307 08:06:31.955227 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:06:43 crc kubenswrapper[4941]: I0307 08:06:43.958613 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:06:43 crc kubenswrapper[4941]: E0307 08:06:43.959927 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:06:47 crc kubenswrapper[4941]: I0307 08:06:47.666029 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5_8a078fed-092e-4d8e-8f31-2c1d6fb0ea10/util/0.log" Mar 07 08:06:47 crc kubenswrapper[4941]: I0307 08:06:47.814326 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5_8a078fed-092e-4d8e-8f31-2c1d6fb0ea10/util/0.log" Mar 07 08:06:47 crc kubenswrapper[4941]: I0307 08:06:47.833234 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5_8a078fed-092e-4d8e-8f31-2c1d6fb0ea10/pull/0.log" Mar 07 08:06:47 crc kubenswrapper[4941]: I0307 08:06:47.861951 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5_8a078fed-092e-4d8e-8f31-2c1d6fb0ea10/pull/0.log" Mar 07 08:06:48 crc kubenswrapper[4941]: I0307 08:06:48.029440 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5_8a078fed-092e-4d8e-8f31-2c1d6fb0ea10/extract/0.log" Mar 07 08:06:48 crc kubenswrapper[4941]: I0307 08:06:48.072585 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5_8a078fed-092e-4d8e-8f31-2c1d6fb0ea10/pull/0.log" Mar 07 08:06:48 crc kubenswrapper[4941]: I0307 08:06:48.104785 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99rv9l5_8a078fed-092e-4d8e-8f31-2c1d6fb0ea10/util/0.log" Mar 07 08:06:48 crc kubenswrapper[4941]: I0307 08:06:48.478193 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-qlb5v_2158c14b-9b89-48d1-b76f-9b98bbfc6972/manager/0.log" Mar 07 08:06:48 crc kubenswrapper[4941]: I0307 08:06:48.786184 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-8qzr4_a698d941-ce95-43c6-9512-1259d85a4cce/manager/0.log" Mar 07 08:06:48 crc 
kubenswrapper[4941]: I0307 08:06:48.985539 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-bvmsw_e803a3db-78f9-4d84-96a8-ffff5f62fe09/manager/0.log" Mar 07 08:06:49 crc kubenswrapper[4941]: I0307 08:06:49.142242 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-7wr8g_47672605-5408-4ff2-8b41-557efdcafbaf/manager/0.log" Mar 07 08:06:49 crc kubenswrapper[4941]: I0307 08:06:49.525026 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-kd7rp_4ef2ce4a-5c3e-436c-bd56-dc15ac199bbf/manager/0.log" Mar 07 08:06:49 crc kubenswrapper[4941]: I0307 08:06:49.757903 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-5lfxq_466fcef1-3bd0-4fff-8e3b-c5dbea9cad30/manager/0.log" Mar 07 08:06:50 crc kubenswrapper[4941]: I0307 08:06:50.076544 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-6r26z_5dee621a-ccf7-486f-9865-fba380e4e1b1/manager/0.log" Mar 07 08:06:50 crc kubenswrapper[4941]: I0307 08:06:50.172012 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-mv6rg_16a86642-eb42-44bd-b668-8295e2316f09/manager/0.log" Mar 07 08:06:50 crc kubenswrapper[4941]: I0307 08:06:50.422625 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-sbpft_4194af88-c299-4713-a885-adb8cceedc13/manager/0.log" Mar 07 08:06:50 crc kubenswrapper[4941]: I0307 08:06:50.437561 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-bctkj_f2b19678-f6f3-41fb-8534-f0b826b523f2/manager/0.log" Mar 07 
08:06:50 crc kubenswrapper[4941]: I0307 08:06:50.790241 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-d7n9x_b292f34f-0728-4c26-8122-3ac065824456/manager/0.log" Mar 07 08:06:50 crc kubenswrapper[4941]: I0307 08:06:50.857238 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-g86j9_19e5c8c2-a9b7-41be-9a9c-9bc60ddd1478/manager/0.log" Mar 07 08:06:50 crc kubenswrapper[4941]: I0307 08:06:50.978475 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-xjw4j_56d9967c-b8ab-43e8-be9d-0593d1e3f320/manager/0.log" Mar 07 08:06:51 crc kubenswrapper[4941]: I0307 08:06:51.079320 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-dc6dbbbd-tx5gn_63bae88f-5e2e-4e53-9e5e-e7d31ca511d1/manager/0.log" Mar 07 08:06:51 crc kubenswrapper[4941]: I0307 08:06:51.378844 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6f44f7b99f-4bphf_86d787b0-daa0-45e6-8c5f-a540f61ec19a/operator/0.log" Mar 07 08:06:51 crc kubenswrapper[4941]: I0307 08:06:51.610776 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-t85z5_37ec2495-f5bc-49d3-81e8-3fa8a0bea8d3/registry-server/0.log" Mar 07 08:06:51 crc kubenswrapper[4941]: I0307 08:06:51.838730 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-6srmj_0dfad6cb-1bbf-4af8-bd06-efc92bfd4347/manager/0.log" Mar 07 08:06:51 crc kubenswrapper[4941]: I0307 08:06:51.976463 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-kbm66_c0b92dab-fef5-4bf2-b07d-f3787dc8060c/manager/0.log" 
Mar 07 08:06:52 crc kubenswrapper[4941]: I0307 08:06:52.059457 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wwtfx_42861835-d760-4e61-b9c2-cd0f3e3478d8/operator/0.log" Mar 07 08:06:52 crc kubenswrapper[4941]: I0307 08:06:52.226716 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-pbkqb_a1f803c1-f954-4aff-b54e-2baae04f1bbf/manager/0.log" Mar 07 08:06:52 crc kubenswrapper[4941]: I0307 08:06:52.461322 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-hqpm5_87e61107-4868-497d-a7fa-73f56f084ff2/manager/0.log" Mar 07 08:06:52 crc kubenswrapper[4941]: I0307 08:06:52.488866 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-wtfh5_5dbee8a7-f5e1-44df-ae39-850574975086/manager/0.log" Mar 07 08:06:52 crc kubenswrapper[4941]: I0307 08:06:52.524113 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dfcb4d64f-9pblb_90ee94e8-276b-476f-a6b6-4729bbd5fab3/manager/0.log" Mar 07 08:06:52 crc kubenswrapper[4941]: I0307 08:06:52.611194 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-8t2kp_12f08e48-5775-4ba5-8321-e68eee8fd2c6/manager/0.log" Mar 07 08:06:57 crc kubenswrapper[4941]: I0307 08:06:57.954135 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:06:57 crc kubenswrapper[4941]: E0307 08:06:57.954803 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:06:57 crc kubenswrapper[4941]: I0307 08:06:57.967582 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-t468c_3df07af4-0aa2-4795-a129-22be2b991b9d/manager/0.log" Mar 07 08:07:10 crc kubenswrapper[4941]: I0307 08:07:10.955036 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:07:10 crc kubenswrapper[4941]: E0307 08:07:10.955881 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:07:12 crc kubenswrapper[4941]: I0307 08:07:12.575936 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-c8tfd_a004c43d-7acc-4a7e-afc1-947c31df55ad/control-plane-machine-set-operator/0.log" Mar 07 08:07:12 crc kubenswrapper[4941]: I0307 08:07:12.771980 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5bq7n_a7c60c91-094d-4c52-9dcb-36ad07c829ad/machine-api-operator/0.log" Mar 07 08:07:12 crc kubenswrapper[4941]: I0307 08:07:12.783555 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5bq7n_a7c60c91-094d-4c52-9dcb-36ad07c829ad/kube-rbac-proxy/0.log" Mar 07 08:07:21 crc kubenswrapper[4941]: I0307 08:07:21.954538 4941 
scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:07:21 crc kubenswrapper[4941]: E0307 08:07:21.955105 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:07:24 crc kubenswrapper[4941]: I0307 08:07:24.797537 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-v96l6_72309cf8-8de4-4863-be70-0d23cc50d0dc/cert-manager-controller/0.log" Mar 07 08:07:24 crc kubenswrapper[4941]: I0307 08:07:24.984966 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-lwxdj_ffa02105-9624-412f-927b-52fce95120aa/cert-manager-cainjector/0.log" Mar 07 08:07:25 crc kubenswrapper[4941]: I0307 08:07:25.035820 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-pkt46_7b5f2700-be42-4711-b35c-5b90686ebe9a/cert-manager-webhook/0.log" Mar 07 08:07:33 crc kubenswrapper[4941]: I0307 08:07:33.963838 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:07:33 crc kubenswrapper[4941]: E0307 08:07:33.964654 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" 
podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:07:37 crc kubenswrapper[4941]: I0307 08:07:37.823381 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-vd5f8_ddd11f70-fc7d-478d-8036-62c895c6fb60/nmstate-console-plugin/0.log" Mar 07 08:07:38 crc kubenswrapper[4941]: I0307 08:07:38.004431 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b4gr5_52b501d5-7ace-4d53-9102-fa7ed37df581/nmstate-handler/0.log" Mar 07 08:07:38 crc kubenswrapper[4941]: I0307 08:07:38.074284 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-fznph_9d22f709-5c4f-4809-8de1-515f401502fe/nmstate-metrics/0.log" Mar 07 08:07:38 crc kubenswrapper[4941]: I0307 08:07:38.102091 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-fznph_9d22f709-5c4f-4809-8de1-515f401502fe/kube-rbac-proxy/0.log" Mar 07 08:07:38 crc kubenswrapper[4941]: I0307 08:07:38.297822 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-77mq6_fa230b05-96f6-4c46-9f79-11ab1d72e453/nmstate-operator/0.log" Mar 07 08:07:38 crc kubenswrapper[4941]: I0307 08:07:38.317150 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-4w5nw_a359df50-440b-47ae-a48d-2ab93b52db74/nmstate-webhook/0.log" Mar 07 08:07:45 crc kubenswrapper[4941]: I0307 08:07:45.955233 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:07:45 crc kubenswrapper[4941]: E0307 08:07:45.956344 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:07:56 crc kubenswrapper[4941]: I0307 08:07:56.954300 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:07:56 crc kubenswrapper[4941]: E0307 08:07:56.954989 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.140647 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547848-ppjp5"] Mar 07 08:08:00 crc kubenswrapper[4941]: E0307 08:08:00.141932 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46707f7-6b0d-43e9-a652-c17c984f4ccf" containerName="oc" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.141965 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46707f7-6b0d-43e9-a652-c17c984f4ccf" containerName="oc" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.142096 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46707f7-6b0d-43e9-a652-c17c984f4ccf" containerName="oc" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.142630 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547848-ppjp5" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.146107 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.146359 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.153978 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.156765 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547848-ppjp5"] Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.254000 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8q9\" (UniqueName: \"kubernetes.io/projected/f1880a2a-6aed-4a89-a900-e5cb9a80ec01-kube-api-access-5f8q9\") pod \"auto-csr-approver-29547848-ppjp5\" (UID: \"f1880a2a-6aed-4a89-a900-e5cb9a80ec01\") " pod="openshift-infra/auto-csr-approver-29547848-ppjp5" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.355666 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8q9\" (UniqueName: \"kubernetes.io/projected/f1880a2a-6aed-4a89-a900-e5cb9a80ec01-kube-api-access-5f8q9\") pod \"auto-csr-approver-29547848-ppjp5\" (UID: \"f1880a2a-6aed-4a89-a900-e5cb9a80ec01\") " pod="openshift-infra/auto-csr-approver-29547848-ppjp5" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.375821 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8q9\" (UniqueName: \"kubernetes.io/projected/f1880a2a-6aed-4a89-a900-e5cb9a80ec01-kube-api-access-5f8q9\") pod \"auto-csr-approver-29547848-ppjp5\" (UID: \"f1880a2a-6aed-4a89-a900-e5cb9a80ec01\") " 
pod="openshift-infra/auto-csr-approver-29547848-ppjp5" Mar 07 08:08:00 crc kubenswrapper[4941]: I0307 08:08:00.475382 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547848-ppjp5" Mar 07 08:08:01 crc kubenswrapper[4941]: I0307 08:08:01.039279 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547848-ppjp5"] Mar 07 08:08:01 crc kubenswrapper[4941]: W0307 08:08:01.041248 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1880a2a_6aed_4a89_a900_e5cb9a80ec01.slice/crio-b666fc5cb59e425a7bd9f04d735d526669022b2cd61b370c9498227974134502 WatchSource:0}: Error finding container b666fc5cb59e425a7bd9f04d735d526669022b2cd61b370c9498227974134502: Status 404 returned error can't find the container with id b666fc5cb59e425a7bd9f04d735d526669022b2cd61b370c9498227974134502 Mar 07 08:08:01 crc kubenswrapper[4941]: I0307 08:08:01.741435 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547848-ppjp5" event={"ID":"f1880a2a-6aed-4a89-a900-e5cb9a80ec01","Type":"ContainerStarted","Data":"b666fc5cb59e425a7bd9f04d735d526669022b2cd61b370c9498227974134502"} Mar 07 08:08:02 crc kubenswrapper[4941]: I0307 08:08:02.757057 4941 generic.go:334] "Generic (PLEG): container finished" podID="f1880a2a-6aed-4a89-a900-e5cb9a80ec01" containerID="ed30c940a584d68ba93010a6797cae757387257d73de9b8f7a957306fe025361" exitCode=0 Mar 07 08:08:02 crc kubenswrapper[4941]: I0307 08:08:02.757116 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547848-ppjp5" event={"ID":"f1880a2a-6aed-4a89-a900-e5cb9a80ec01","Type":"ContainerDied","Data":"ed30c940a584d68ba93010a6797cae757387257d73de9b8f7a957306fe025361"} Mar 07 08:08:04 crc kubenswrapper[4941]: I0307 08:08:04.092651 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547848-ppjp5" Mar 07 08:08:04 crc kubenswrapper[4941]: I0307 08:08:04.210355 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f8q9\" (UniqueName: \"kubernetes.io/projected/f1880a2a-6aed-4a89-a900-e5cb9a80ec01-kube-api-access-5f8q9\") pod \"f1880a2a-6aed-4a89-a900-e5cb9a80ec01\" (UID: \"f1880a2a-6aed-4a89-a900-e5cb9a80ec01\") " Mar 07 08:08:04 crc kubenswrapper[4941]: I0307 08:08:04.224655 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1880a2a-6aed-4a89-a900-e5cb9a80ec01-kube-api-access-5f8q9" (OuterVolumeSpecName: "kube-api-access-5f8q9") pod "f1880a2a-6aed-4a89-a900-e5cb9a80ec01" (UID: "f1880a2a-6aed-4a89-a900-e5cb9a80ec01"). InnerVolumeSpecName "kube-api-access-5f8q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:08:04 crc kubenswrapper[4941]: I0307 08:08:04.311818 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f8q9\" (UniqueName: \"kubernetes.io/projected/f1880a2a-6aed-4a89-a900-e5cb9a80ec01-kube-api-access-5f8q9\") on node \"crc\" DevicePath \"\"" Mar 07 08:08:04 crc kubenswrapper[4941]: I0307 08:08:04.769566 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547848-ppjp5" event={"ID":"f1880a2a-6aed-4a89-a900-e5cb9a80ec01","Type":"ContainerDied","Data":"b666fc5cb59e425a7bd9f04d735d526669022b2cd61b370c9498227974134502"} Mar 07 08:08:04 crc kubenswrapper[4941]: I0307 08:08:04.769604 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b666fc5cb59e425a7bd9f04d735d526669022b2cd61b370c9498227974134502" Mar 07 08:08:04 crc kubenswrapper[4941]: I0307 08:08:04.769626 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547848-ppjp5" Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.158402 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547842-n4pcb"] Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.163407 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547842-n4pcb"] Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.515297 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-7xzpv_7bade6d7-3a37-4a66-b777-acddd50efb79/kube-rbac-proxy/0.log" Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.663857 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-frr-files/0.log" Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.797537 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-7xzpv_7bade6d7-3a37-4a66-b777-acddd50efb79/controller/0.log" Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.906523 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-metrics/0.log" Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.915043 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-reloader/0.log" Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.939946 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-frr-files/0.log" Mar 07 08:08:05 crc kubenswrapper[4941]: I0307 08:08:05.962798 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa4b3ca-db53-46ff-bae2-e878e699b64d" path="/var/lib/kubelet/pods/1aa4b3ca-db53-46ff-bae2-e878e699b64d/volumes" Mar 07 08:08:05 
crc kubenswrapper[4941]: I0307 08:08:05.981853 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-reloader/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.133718 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-reloader/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.147698 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-metrics/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.152239 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-frr-files/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.166020 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-metrics/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.363104 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-frr-files/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.393072 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/controller/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.400560 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-reloader/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.403355 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/cp-metrics/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.591255 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/frr-metrics/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.623375 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/kube-rbac-proxy-frr/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.624026 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/kube-rbac-proxy/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.826564 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-4f2fb_477add90-db8d-449b-bc6f-45618a7e89f6/frr-k8s-webhook-server/0.log" Mar 07 08:08:06 crc kubenswrapper[4941]: I0307 08:08:06.828711 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/reloader/0.log" Mar 07 08:08:07 crc kubenswrapper[4941]: I0307 08:08:07.046896 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5fcd57bf5c-fvjx6_40a1b026-57b0-4461-ab92-8e21f5ba9769/manager/0.log" Mar 07 08:08:07 crc kubenswrapper[4941]: I0307 08:08:07.229061 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-75bb886-4qfz7_a9be712c-d754-4d39-b871-4199924fa125/webhook-server/0.log" Mar 07 08:08:07 crc kubenswrapper[4941]: I0307 08:08:07.323588 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-22x9t_49a7425c-52fd-48b6-a2de-53dc8ab8c531/kube-rbac-proxy/0.log" Mar 07 08:08:07 crc kubenswrapper[4941]: I0307 08:08:07.895888 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-22x9t_49a7425c-52fd-48b6-a2de-53dc8ab8c531/speaker/0.log" Mar 07 08:08:08 crc kubenswrapper[4941]: I0307 08:08:08.075654 4941 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gdcv9_d571499c-14eb-495f-930d-9dafb0a3a093/frr/0.log" Mar 07 08:08:10 crc kubenswrapper[4941]: I0307 08:08:10.408337 4941 scope.go:117] "RemoveContainer" containerID="750999ffbe8c014cc77dc6a1986dfc4d3a05ce8e9a34ee1a1d36bad99eee6318" Mar 07 08:08:10 crc kubenswrapper[4941]: I0307 08:08:10.954614 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:08:10 crc kubenswrapper[4941]: E0307 08:08:10.955142 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:08:21 crc kubenswrapper[4941]: I0307 08:08:21.362550 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g_524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f/util/0.log" Mar 07 08:08:21 crc kubenswrapper[4941]: I0307 08:08:21.730116 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g_524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f/util/0.log" Mar 07 08:08:21 crc kubenswrapper[4941]: I0307 08:08:21.759517 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g_524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f/pull/0.log" Mar 07 08:08:21 crc kubenswrapper[4941]: I0307 08:08:21.789518 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g_524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f/pull/0.log" Mar 07 08:08:21 crc kubenswrapper[4941]: I0307 08:08:21.955162 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:08:21 crc kubenswrapper[4941]: E0307 08:08:21.955419 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.141029 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g_524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f/util/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.183583 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g_524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f/pull/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.210732 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82wmc7g_524f2d31-e76d-4fdb-ad91-b6a4b85e3a4f/extract/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.363456 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r_25b9610f-6fe6-40b8-868f-0a834314b6c8/util/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.495302 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r_25b9610f-6fe6-40b8-868f-0a834314b6c8/util/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.560676 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r_25b9610f-6fe6-40b8-868f-0a834314b6c8/pull/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.561128 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r_25b9610f-6fe6-40b8-868f-0a834314b6c8/pull/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.699753 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r_25b9610f-6fe6-40b8-868f-0a834314b6c8/util/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.728163 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r_25b9610f-6fe6-40b8-868f-0a834314b6c8/pull/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.737171 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hft6r_25b9610f-6fe6-40b8-868f-0a834314b6c8/extract/0.log" Mar 07 08:08:22 crc kubenswrapper[4941]: I0307 08:08:22.907155 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v6vdl_94a21068-6164-47cd-ae7b-ec85cbed1247/extract-utilities/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.151064 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v6vdl_94a21068-6164-47cd-ae7b-ec85cbed1247/extract-utilities/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 
08:08:23.195303 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v6vdl_94a21068-6164-47cd-ae7b-ec85cbed1247/extract-content/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.202597 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v6vdl_94a21068-6164-47cd-ae7b-ec85cbed1247/extract-content/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.347069 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v6vdl_94a21068-6164-47cd-ae7b-ec85cbed1247/extract-utilities/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.347462 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v6vdl_94a21068-6164-47cd-ae7b-ec85cbed1247/extract-content/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.596742 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8sk94_f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59/extract-utilities/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.816677 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8sk94_f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59/extract-utilities/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.823621 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8sk94_f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59/extract-content/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.840212 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8sk94_f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59/extract-content/0.log" Mar 07 08:08:23 crc kubenswrapper[4941]: I0307 08:08:23.948750 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-v6vdl_94a21068-6164-47cd-ae7b-ec85cbed1247/registry-server/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.006180 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8sk94_f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59/extract-content/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.011447 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8sk94_f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59/extract-utilities/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.222594 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j_bd5d5308-944e-4c42-a452-049f94c4d06b/util/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.292845 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8sk94_f5c8c58f-e4ad-4cbb-abb9-90ee64ad5f59/registry-server/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.405746 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j_bd5d5308-944e-4c42-a452-049f94c4d06b/util/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.420612 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j_bd5d5308-944e-4c42-a452-049f94c4d06b/pull/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.468652 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j_bd5d5308-944e-4c42-a452-049f94c4d06b/pull/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.658205 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j_bd5d5308-944e-4c42-a452-049f94c4d06b/extract/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.673808 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j_bd5d5308-944e-4c42-a452-049f94c4d06b/util/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.676884 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cqz8j_bd5d5308-944e-4c42-a452-049f94c4d06b/pull/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.842667 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-l9krl_f6683fc4-e18a-403c-a62a-0c451060c844/marketplace-operator/0.log" Mar 07 08:08:24 crc kubenswrapper[4941]: I0307 08:08:24.893204 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mghxn_1f4849d1-1617-465a-958a-1b31bb5de2df/extract-utilities/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.068134 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mghxn_1f4849d1-1617-465a-958a-1b31bb5de2df/extract-content/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.090235 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mghxn_1f4849d1-1617-465a-958a-1b31bb5de2df/extract-content/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.096393 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mghxn_1f4849d1-1617-465a-958a-1b31bb5de2df/extract-utilities/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.264750 4941 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mghxn_1f4849d1-1617-465a-958a-1b31bb5de2df/extract-utilities/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.281985 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mghxn_1f4849d1-1617-465a-958a-1b31bb5de2df/extract-content/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.422141 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mghxn_1f4849d1-1617-465a-958a-1b31bb5de2df/registry-server/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.450775 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bt48m_503fd081-26d9-469a-a201-aac508651d3e/extract-utilities/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.593600 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bt48m_503fd081-26d9-469a-a201-aac508651d3e/extract-utilities/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.610239 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bt48m_503fd081-26d9-469a-a201-aac508651d3e/extract-content/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.647283 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bt48m_503fd081-26d9-469a-a201-aac508651d3e/extract-content/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.776072 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bt48m_503fd081-26d9-469a-a201-aac508651d3e/extract-content/0.log" Mar 07 08:08:25 crc kubenswrapper[4941]: I0307 08:08:25.796352 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bt48m_503fd081-26d9-469a-a201-aac508651d3e/extract-utilities/0.log" Mar 
07 08:08:26 crc kubenswrapper[4941]: I0307 08:08:26.280248 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bt48m_503fd081-26d9-469a-a201-aac508651d3e/registry-server/0.log" Mar 07 08:08:33 crc kubenswrapper[4941]: I0307 08:08:33.957638 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:08:33 crc kubenswrapper[4941]: E0307 08:08:33.959540 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:08:48 crc kubenswrapper[4941]: I0307 08:08:48.954859 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:08:48 crc kubenswrapper[4941]: E0307 08:08:48.955810 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:09:00 crc kubenswrapper[4941]: I0307 08:09:00.955885 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:09:00 crc kubenswrapper[4941]: E0307 08:09:00.956708 4941 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-knkqz_openshift-machine-config-operator(250d2c0d-993b-466a-a5e0-bacae5fe8df5)\"" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" Mar 07 08:09:12 crc kubenswrapper[4941]: I0307 08:09:12.954680 4941 scope.go:117] "RemoveContainer" containerID="504048a2d5830eed0655c6c363722219fdf01009bbfc7d3796be650b523b632e" Mar 07 08:09:13 crc kubenswrapper[4941]: I0307 08:09:13.227861 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" event={"ID":"250d2c0d-993b-466a-a5e0-bacae5fe8df5","Type":"ContainerStarted","Data":"e95895045043da5f8f7582f956f32beba12fc5b7822f57661a5cee426a532671"} Mar 07 08:09:40 crc kubenswrapper[4941]: I0307 08:09:40.485208 4941 generic.go:334] "Generic (PLEG): container finished" podID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerID="5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2" exitCode=0 Mar 07 08:09:40 crc kubenswrapper[4941]: I0307 08:09:40.485275 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-krlnv/must-gather-fwwrw" event={"ID":"55fc6e25-cfc9-49b7-a641-bfa456c35640","Type":"ContainerDied","Data":"5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2"} Mar 07 08:09:40 crc kubenswrapper[4941]: I0307 08:09:40.486162 4941 scope.go:117] "RemoveContainer" containerID="5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2" Mar 07 08:09:41 crc kubenswrapper[4941]: I0307 08:09:41.290117 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-krlnv_must-gather-fwwrw_55fc6e25-cfc9-49b7-a641-bfa456c35640/gather/0.log" Mar 07 08:09:49 crc kubenswrapper[4941]: I0307 08:09:49.630984 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-krlnv/must-gather-fwwrw"] Mar 07 08:09:49 crc kubenswrapper[4941]: I0307 08:09:49.632256 4941 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-krlnv/must-gather-fwwrw" podUID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerName="copy" containerID="cri-o://7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044" gracePeriod=2 Mar 07 08:09:49 crc kubenswrapper[4941]: I0307 08:09:49.644910 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-krlnv/must-gather-fwwrw"] Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.038618 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-krlnv_must-gather-fwwrw_55fc6e25-cfc9-49b7-a641-bfa456c35640/copy/0.log" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.039322 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.147506 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55fc6e25-cfc9-49b7-a641-bfa456c35640-must-gather-output\") pod \"55fc6e25-cfc9-49b7-a641-bfa456c35640\" (UID: \"55fc6e25-cfc9-49b7-a641-bfa456c35640\") " Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.147607 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mxgt\" (UniqueName: \"kubernetes.io/projected/55fc6e25-cfc9-49b7-a641-bfa456c35640-kube-api-access-2mxgt\") pod \"55fc6e25-cfc9-49b7-a641-bfa456c35640\" (UID: \"55fc6e25-cfc9-49b7-a641-bfa456c35640\") " Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.162397 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55fc6e25-cfc9-49b7-a641-bfa456c35640-kube-api-access-2mxgt" (OuterVolumeSpecName: "kube-api-access-2mxgt") pod "55fc6e25-cfc9-49b7-a641-bfa456c35640" (UID: "55fc6e25-cfc9-49b7-a641-bfa456c35640"). 
InnerVolumeSpecName "kube-api-access-2mxgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.249073 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mxgt\" (UniqueName: \"kubernetes.io/projected/55fc6e25-cfc9-49b7-a641-bfa456c35640-kube-api-access-2mxgt\") on node \"crc\" DevicePath \"\"" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.252500 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55fc6e25-cfc9-49b7-a641-bfa456c35640-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "55fc6e25-cfc9-49b7-a641-bfa456c35640" (UID: "55fc6e25-cfc9-49b7-a641-bfa456c35640"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.353034 4941 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55fc6e25-cfc9-49b7-a641-bfa456c35640-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.569943 4941 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-krlnv_must-gather-fwwrw_55fc6e25-cfc9-49b7-a641-bfa456c35640/copy/0.log" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.570439 4941 generic.go:334] "Generic (PLEG): container finished" podID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerID="7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044" exitCode=143 Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.570510 4941 scope.go:117] "RemoveContainer" containerID="7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.570525 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-krlnv/must-gather-fwwrw" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.599930 4941 scope.go:117] "RemoveContainer" containerID="5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.659111 4941 scope.go:117] "RemoveContainer" containerID="7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044" Mar 07 08:09:50 crc kubenswrapper[4941]: E0307 08:09:50.659613 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044\": container with ID starting with 7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044 not found: ID does not exist" containerID="7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.659639 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044"} err="failed to get container status \"7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044\": rpc error: code = NotFound desc = could not find container \"7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044\": container with ID starting with 7fcab115a2e738b363a23f51ff55124c0b7707bcfdf3d47dc31ddb584eff7044 not found: ID does not exist" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.659658 4941 scope.go:117] "RemoveContainer" containerID="5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2" Mar 07 08:09:50 crc kubenswrapper[4941]: E0307 08:09:50.660030 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2\": container with ID starting with 
5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2 not found: ID does not exist" containerID="5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2" Mar 07 08:09:50 crc kubenswrapper[4941]: I0307 08:09:50.660067 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2"} err="failed to get container status \"5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2\": rpc error: code = NotFound desc = could not find container \"5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2\": container with ID starting with 5d7f890f23e236268ac6755cdfa73cb018b3ea679f2e4ab39a1b247fd4200dd2 not found: ID does not exist" Mar 07 08:09:51 crc kubenswrapper[4941]: I0307 08:09:51.967549 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55fc6e25-cfc9-49b7-a641-bfa456c35640" path="/var/lib/kubelet/pods/55fc6e25-cfc9-49b7-a641-bfa456c35640/volumes" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.175820 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547850-4vqhj"] Mar 07 08:10:00 crc kubenswrapper[4941]: E0307 08:10:00.176758 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerName="gather" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.176778 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerName="gather" Mar 07 08:10:00 crc kubenswrapper[4941]: E0307 08:10:00.176810 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerName="copy" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.176818 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerName="copy" Mar 07 08:10:00 crc kubenswrapper[4941]: E0307 
08:10:00.176840 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1880a2a-6aed-4a89-a900-e5cb9a80ec01" containerName="oc" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.176848 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1880a2a-6aed-4a89-a900-e5cb9a80ec01" containerName="oc" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.177018 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerName="copy" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.177039 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fc6e25-cfc9-49b7-a641-bfa456c35640" containerName="gather" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.177053 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1880a2a-6aed-4a89-a900-e5cb9a80ec01" containerName="oc" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.177659 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547850-4vqhj" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.183503 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547850-4vqhj"] Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.189693 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.190008 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.191540 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.203634 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdrcb\" (UniqueName: \"kubernetes.io/projected/13b8bf68-7661-4910-9ea2-d8fb85d2634a-kube-api-access-mdrcb\") pod \"auto-csr-approver-29547850-4vqhj\" (UID: \"13b8bf68-7661-4910-9ea2-d8fb85d2634a\") " pod="openshift-infra/auto-csr-approver-29547850-4vqhj" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.305643 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdrcb\" (UniqueName: \"kubernetes.io/projected/13b8bf68-7661-4910-9ea2-d8fb85d2634a-kube-api-access-mdrcb\") pod \"auto-csr-approver-29547850-4vqhj\" (UID: \"13b8bf68-7661-4910-9ea2-d8fb85d2634a\") " pod="openshift-infra/auto-csr-approver-29547850-4vqhj" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.327505 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdrcb\" (UniqueName: \"kubernetes.io/projected/13b8bf68-7661-4910-9ea2-d8fb85d2634a-kube-api-access-mdrcb\") pod \"auto-csr-approver-29547850-4vqhj\" (UID: \"13b8bf68-7661-4910-9ea2-d8fb85d2634a\") " 
pod="openshift-infra/auto-csr-approver-29547850-4vqhj" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.517740 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547850-4vqhj" Mar 07 08:10:00 crc kubenswrapper[4941]: I0307 08:10:00.929299 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547850-4vqhj"] Mar 07 08:10:01 crc kubenswrapper[4941]: I0307 08:10:01.652019 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547850-4vqhj" event={"ID":"13b8bf68-7661-4910-9ea2-d8fb85d2634a","Type":"ContainerStarted","Data":"cabff6885d658d01581a74981d69070ee66b971368f2dcc784cfc7a7c8d39983"} Mar 07 08:10:02 crc kubenswrapper[4941]: I0307 08:10:02.662455 4941 generic.go:334] "Generic (PLEG): container finished" podID="13b8bf68-7661-4910-9ea2-d8fb85d2634a" containerID="fd7f4e998b2390fa90313434f6a18bc7ff8be6968e23fc193517f939c969b2c3" exitCode=0 Mar 07 08:10:02 crc kubenswrapper[4941]: I0307 08:10:02.662582 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547850-4vqhj" event={"ID":"13b8bf68-7661-4910-9ea2-d8fb85d2634a","Type":"ContainerDied","Data":"fd7f4e998b2390fa90313434f6a18bc7ff8be6968e23fc193517f939c969b2c3"} Mar 07 08:10:03 crc kubenswrapper[4941]: I0307 08:10:03.993132 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547850-4vqhj" Mar 07 08:10:04 crc kubenswrapper[4941]: I0307 08:10:04.057698 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdrcb\" (UniqueName: \"kubernetes.io/projected/13b8bf68-7661-4910-9ea2-d8fb85d2634a-kube-api-access-mdrcb\") pod \"13b8bf68-7661-4910-9ea2-d8fb85d2634a\" (UID: \"13b8bf68-7661-4910-9ea2-d8fb85d2634a\") " Mar 07 08:10:04 crc kubenswrapper[4941]: I0307 08:10:04.062433 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b8bf68-7661-4910-9ea2-d8fb85d2634a-kube-api-access-mdrcb" (OuterVolumeSpecName: "kube-api-access-mdrcb") pod "13b8bf68-7661-4910-9ea2-d8fb85d2634a" (UID: "13b8bf68-7661-4910-9ea2-d8fb85d2634a"). InnerVolumeSpecName "kube-api-access-mdrcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:10:04 crc kubenswrapper[4941]: I0307 08:10:04.159176 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdrcb\" (UniqueName: \"kubernetes.io/projected/13b8bf68-7661-4910-9ea2-d8fb85d2634a-kube-api-access-mdrcb\") on node \"crc\" DevicePath \"\"" Mar 07 08:10:04 crc kubenswrapper[4941]: I0307 08:10:04.681340 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547850-4vqhj" Mar 07 08:10:04 crc kubenswrapper[4941]: I0307 08:10:04.681299 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547850-4vqhj" event={"ID":"13b8bf68-7661-4910-9ea2-d8fb85d2634a","Type":"ContainerDied","Data":"cabff6885d658d01581a74981d69070ee66b971368f2dcc784cfc7a7c8d39983"} Mar 07 08:10:04 crc kubenswrapper[4941]: I0307 08:10:04.681646 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cabff6885d658d01581a74981d69070ee66b971368f2dcc784cfc7a7c8d39983" Mar 07 08:10:05 crc kubenswrapper[4941]: I0307 08:10:05.078636 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547844-4ksm7"] Mar 07 08:10:05 crc kubenswrapper[4941]: I0307 08:10:05.089670 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547844-4ksm7"] Mar 07 08:10:05 crc kubenswrapper[4941]: I0307 08:10:05.962185 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9b7921-0e74-4196-ba1f-c488ddf24b10" path="/var/lib/kubelet/pods/af9b7921-0e74-4196-ba1f-c488ddf24b10/volumes" Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.485946 4941 scope.go:117] "RemoveContainer" containerID="ed383bab6c6642898a4591825b116d1ed82c696e12faf013cf3abd1d4a602cc8" Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.809077 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dbtkj"] Mar 07 08:10:10 crc kubenswrapper[4941]: E0307 08:10:10.809662 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b8bf68-7661-4910-9ea2-d8fb85d2634a" containerName="oc" Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.809682 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b8bf68-7661-4910-9ea2-d8fb85d2634a" containerName="oc" Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.809828 
4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b8bf68-7661-4910-9ea2-d8fb85d2634a" containerName="oc" Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.810887 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.828205 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbtkj"] Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.900170 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx7rm\" (UniqueName: \"kubernetes.io/projected/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-kube-api-access-vx7rm\") pod \"redhat-marketplace-dbtkj\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.900225 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-catalog-content\") pod \"redhat-marketplace-dbtkj\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:10 crc kubenswrapper[4941]: I0307 08:10:10.900320 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-utilities\") pod \"redhat-marketplace-dbtkj\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.001597 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-utilities\") pod 
\"redhat-marketplace-dbtkj\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.001702 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx7rm\" (UniqueName: \"kubernetes.io/projected/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-kube-api-access-vx7rm\") pod \"redhat-marketplace-dbtkj\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.001737 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-catalog-content\") pod \"redhat-marketplace-dbtkj\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.002356 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-utilities\") pod \"redhat-marketplace-dbtkj\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.002592 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-catalog-content\") pod \"redhat-marketplace-dbtkj\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.028711 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx7rm\" (UniqueName: \"kubernetes.io/projected/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-kube-api-access-vx7rm\") pod \"redhat-marketplace-dbtkj\" (UID: 
\"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.139580 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.447502 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbtkj"] Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.732675 4941 generic.go:334] "Generic (PLEG): container finished" podID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerID="33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7" exitCode=0 Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.732740 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbtkj" event={"ID":"aba6cd33-5205-4ae5-ba7c-18a58c9462b1","Type":"ContainerDied","Data":"33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7"} Mar 07 08:10:11 crc kubenswrapper[4941]: I0307 08:10:11.732780 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbtkj" event={"ID":"aba6cd33-5205-4ae5-ba7c-18a58c9462b1","Type":"ContainerStarted","Data":"1b85aff58e044042076d36bddd989db171131faf25f030462793888c0cbea03f"} Mar 07 08:10:13 crc kubenswrapper[4941]: I0307 08:10:13.760470 4941 generic.go:334] "Generic (PLEG): container finished" podID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerID="fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b" exitCode=0 Mar 07 08:10:13 crc kubenswrapper[4941]: I0307 08:10:13.760559 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbtkj" event={"ID":"aba6cd33-5205-4ae5-ba7c-18a58c9462b1","Type":"ContainerDied","Data":"fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b"} Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.395175 
4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5dnm2"] Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.397281 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.401514 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5dnm2"] Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.460635 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-catalog-content\") pod \"community-operators-5dnm2\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.460726 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-utilities\") pod \"community-operators-5dnm2\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.460829 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlm8l\" (UniqueName: \"kubernetes.io/projected/2e6c569f-5a19-40ca-9225-db252b79f413-kube-api-access-vlm8l\") pod \"community-operators-5dnm2\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.562138 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-catalog-content\") pod 
\"community-operators-5dnm2\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.562566 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-utilities\") pod \"community-operators-5dnm2\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.562715 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-catalog-content\") pod \"community-operators-5dnm2\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.562730 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlm8l\" (UniqueName: \"kubernetes.io/projected/2e6c569f-5a19-40ca-9225-db252b79f413-kube-api-access-vlm8l\") pod \"community-operators-5dnm2\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.563087 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-utilities\") pod \"community-operators-5dnm2\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.589505 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlm8l\" (UniqueName: \"kubernetes.io/projected/2e6c569f-5a19-40ca-9225-db252b79f413-kube-api-access-vlm8l\") pod \"community-operators-5dnm2\" (UID: 
\"2e6c569f-5a19-40ca-9225-db252b79f413\") " pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.732959 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.771698 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbtkj" event={"ID":"aba6cd33-5205-4ae5-ba7c-18a58c9462b1","Type":"ContainerStarted","Data":"371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155"} Mar 07 08:10:14 crc kubenswrapper[4941]: I0307 08:10:14.796165 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dbtkj" podStartSLOduration=2.239891176 podStartE2EDuration="4.796147478s" podCreationTimestamp="2026-03-07 08:10:10 +0000 UTC" firstStartedPulling="2026-03-07 08:10:11.734808283 +0000 UTC m=+4708.687173788" lastFinishedPulling="2026-03-07 08:10:14.291064615 +0000 UTC m=+4711.243430090" observedRunningTime="2026-03-07 08:10:14.788924331 +0000 UTC m=+4711.741289796" watchObservedRunningTime="2026-03-07 08:10:14.796147478 +0000 UTC m=+4711.748512933" Mar 07 08:10:15 crc kubenswrapper[4941]: I0307 08:10:15.232260 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5dnm2"] Mar 07 08:10:15 crc kubenswrapper[4941]: W0307 08:10:15.237430 4941 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6c569f_5a19_40ca_9225_db252b79f413.slice/crio-96feb7fa22823504c8977206c38efca3801849b9eb779ab2bc0f2e7fd7f8b43d WatchSource:0}: Error finding container 96feb7fa22823504c8977206c38efca3801849b9eb779ab2bc0f2e7fd7f8b43d: Status 404 returned error can't find the container with id 96feb7fa22823504c8977206c38efca3801849b9eb779ab2bc0f2e7fd7f8b43d Mar 07 08:10:15 crc 
kubenswrapper[4941]: I0307 08:10:15.779991 4941 generic.go:334] "Generic (PLEG): container finished" podID="2e6c569f-5a19-40ca-9225-db252b79f413" containerID="fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa" exitCode=0 Mar 07 08:10:15 crc kubenswrapper[4941]: I0307 08:10:15.780036 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dnm2" event={"ID":"2e6c569f-5a19-40ca-9225-db252b79f413","Type":"ContainerDied","Data":"fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa"} Mar 07 08:10:15 crc kubenswrapper[4941]: I0307 08:10:15.780317 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dnm2" event={"ID":"2e6c569f-5a19-40ca-9225-db252b79f413","Type":"ContainerStarted","Data":"96feb7fa22823504c8977206c38efca3801849b9eb779ab2bc0f2e7fd7f8b43d"} Mar 07 08:10:16 crc kubenswrapper[4941]: I0307 08:10:16.792493 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dnm2" event={"ID":"2e6c569f-5a19-40ca-9225-db252b79f413","Type":"ContainerStarted","Data":"28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe"} Mar 07 08:10:17 crc kubenswrapper[4941]: I0307 08:10:17.800574 4941 generic.go:334] "Generic (PLEG): container finished" podID="2e6c569f-5a19-40ca-9225-db252b79f413" containerID="28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe" exitCode=0 Mar 07 08:10:17 crc kubenswrapper[4941]: I0307 08:10:17.800623 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dnm2" event={"ID":"2e6c569f-5a19-40ca-9225-db252b79f413","Type":"ContainerDied","Data":"28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe"} Mar 07 08:10:18 crc kubenswrapper[4941]: I0307 08:10:18.807465 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dnm2" 
event={"ID":"2e6c569f-5a19-40ca-9225-db252b79f413","Type":"ContainerStarted","Data":"79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c"} Mar 07 08:10:18 crc kubenswrapper[4941]: I0307 08:10:18.838145 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5dnm2" podStartSLOduration=2.1936556879999998 podStartE2EDuration="4.838125165s" podCreationTimestamp="2026-03-07 08:10:14 +0000 UTC" firstStartedPulling="2026-03-07 08:10:15.781363492 +0000 UTC m=+4712.733728957" lastFinishedPulling="2026-03-07 08:10:18.425832959 +0000 UTC m=+4715.378198434" observedRunningTime="2026-03-07 08:10:18.834060766 +0000 UTC m=+4715.786426231" watchObservedRunningTime="2026-03-07 08:10:18.838125165 +0000 UTC m=+4715.790490630" Mar 07 08:10:21 crc kubenswrapper[4941]: I0307 08:10:21.140847 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:21 crc kubenswrapper[4941]: I0307 08:10:21.141241 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:21 crc kubenswrapper[4941]: I0307 08:10:21.194514 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:21 crc kubenswrapper[4941]: I0307 08:10:21.895049 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:22 crc kubenswrapper[4941]: I0307 08:10:22.372650 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbtkj"] Mar 07 08:10:23 crc kubenswrapper[4941]: I0307 08:10:23.846659 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dbtkj" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerName="registry-server" 
containerID="cri-o://371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155" gracePeriod=2 Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.371901 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.504989 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-utilities\") pod \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.505120 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx7rm\" (UniqueName: \"kubernetes.io/projected/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-kube-api-access-vx7rm\") pod \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.505162 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-catalog-content\") pod \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\" (UID: \"aba6cd33-5205-4ae5-ba7c-18a58c9462b1\") " Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.506719 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-utilities" (OuterVolumeSpecName: "utilities") pod "aba6cd33-5205-4ae5-ba7c-18a58c9462b1" (UID: "aba6cd33-5205-4ae5-ba7c-18a58c9462b1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.519905 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-kube-api-access-vx7rm" (OuterVolumeSpecName: "kube-api-access-vx7rm") pod "aba6cd33-5205-4ae5-ba7c-18a58c9462b1" (UID: "aba6cd33-5205-4ae5-ba7c-18a58c9462b1"). InnerVolumeSpecName "kube-api-access-vx7rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.535824 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aba6cd33-5205-4ae5-ba7c-18a58c9462b1" (UID: "aba6cd33-5205-4ae5-ba7c-18a58c9462b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.613164 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.613201 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.613216 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx7rm\" (UniqueName: \"kubernetes.io/projected/aba6cd33-5205-4ae5-ba7c-18a58c9462b1-kube-api-access-vx7rm\") on node \"crc\" DevicePath \"\"" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.733754 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:24 crc 
kubenswrapper[4941]: I0307 08:10:24.736207 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.786131 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.855168 4941 generic.go:334] "Generic (PLEG): container finished" podID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerID="371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155" exitCode=0 Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.855285 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbtkj" event={"ID":"aba6cd33-5205-4ae5-ba7c-18a58c9462b1","Type":"ContainerDied","Data":"371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155"} Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.855321 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbtkj" event={"ID":"aba6cd33-5205-4ae5-ba7c-18a58c9462b1","Type":"ContainerDied","Data":"1b85aff58e044042076d36bddd989db171131faf25f030462793888c0cbea03f"} Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.855341 4941 scope.go:117] "RemoveContainer" containerID="371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.856652 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbtkj" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.880475 4941 scope.go:117] "RemoveContainer" containerID="fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.896556 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbtkj"] Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.911168 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbtkj"] Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.914688 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.936681 4941 scope.go:117] "RemoveContainer" containerID="33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.968007 4941 scope.go:117] "RemoveContainer" containerID="371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155" Mar 07 08:10:24 crc kubenswrapper[4941]: E0307 08:10:24.968567 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155\": container with ID starting with 371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155 not found: ID does not exist" containerID="371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.968619 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155"} err="failed to get container status \"371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155\": rpc error: code = NotFound desc = could not 
find container \"371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155\": container with ID starting with 371555acf3793de47b9c23747d646f8a172f12ffe5aeb542dc2fb90bd5d2b155 not found: ID does not exist" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.968653 4941 scope.go:117] "RemoveContainer" containerID="fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b" Mar 07 08:10:24 crc kubenswrapper[4941]: E0307 08:10:24.969104 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b\": container with ID starting with fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b not found: ID does not exist" containerID="fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.969146 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b"} err="failed to get container status \"fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b\": rpc error: code = NotFound desc = could not find container \"fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b\": container with ID starting with fac1e9e7c2245f0869ba11935fd1d2c41263d70605521eadc1eb27edeac40a3b not found: ID does not exist" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.969165 4941 scope.go:117] "RemoveContainer" containerID="33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7" Mar 07 08:10:24 crc kubenswrapper[4941]: E0307 08:10:24.969575 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7\": container with ID starting with 33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7 not found: ID 
does not exist" containerID="33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7" Mar 07 08:10:24 crc kubenswrapper[4941]: I0307 08:10:24.969597 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7"} err="failed to get container status \"33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7\": rpc error: code = NotFound desc = could not find container \"33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7\": container with ID starting with 33120006c5acbdaca17b9e29fdf6862d6ebffd3e8d3039812ed17297bea907c7 not found: ID does not exist" Mar 07 08:10:25 crc kubenswrapper[4941]: I0307 08:10:25.970667 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" path="/var/lib/kubelet/pods/aba6cd33-5205-4ae5-ba7c-18a58c9462b1/volumes" Mar 07 08:10:26 crc kubenswrapper[4941]: I0307 08:10:26.574640 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5dnm2"] Mar 07 08:10:27 crc kubenswrapper[4941]: I0307 08:10:27.879367 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5dnm2" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" containerName="registry-server" containerID="cri-o://79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c" gracePeriod=2 Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.641228 4941 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.674362 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-utilities\") pod \"2e6c569f-5a19-40ca-9225-db252b79f413\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.674490 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlm8l\" (UniqueName: \"kubernetes.io/projected/2e6c569f-5a19-40ca-9225-db252b79f413-kube-api-access-vlm8l\") pod \"2e6c569f-5a19-40ca-9225-db252b79f413\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.674712 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-catalog-content\") pod \"2e6c569f-5a19-40ca-9225-db252b79f413\" (UID: \"2e6c569f-5a19-40ca-9225-db252b79f413\") " Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.675708 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-utilities" (OuterVolumeSpecName: "utilities") pod "2e6c569f-5a19-40ca-9225-db252b79f413" (UID: "2e6c569f-5a19-40ca-9225-db252b79f413"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.681275 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6c569f-5a19-40ca-9225-db252b79f413-kube-api-access-vlm8l" (OuterVolumeSpecName: "kube-api-access-vlm8l") pod "2e6c569f-5a19-40ca-9225-db252b79f413" (UID: "2e6c569f-5a19-40ca-9225-db252b79f413"). InnerVolumeSpecName "kube-api-access-vlm8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.727330 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e6c569f-5a19-40ca-9225-db252b79f413" (UID: "2e6c569f-5a19-40ca-9225-db252b79f413"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.777325 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.777366 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6c569f-5a19-40ca-9225-db252b79f413-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.777377 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlm8l\" (UniqueName: \"kubernetes.io/projected/2e6c569f-5a19-40ca-9225-db252b79f413-kube-api-access-vlm8l\") on node \"crc\" DevicePath \"\"" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.894116 4941 generic.go:334] "Generic (PLEG): container finished" podID="2e6c569f-5a19-40ca-9225-db252b79f413" containerID="79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c" exitCode=0 Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.894215 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dnm2" event={"ID":"2e6c569f-5a19-40ca-9225-db252b79f413","Type":"ContainerDied","Data":"79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c"} Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.894273 4941 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-5dnm2" event={"ID":"2e6c569f-5a19-40ca-9225-db252b79f413","Type":"ContainerDied","Data":"96feb7fa22823504c8977206c38efca3801849b9eb779ab2bc0f2e7fd7f8b43d"} Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.894336 4941 scope.go:117] "RemoveContainer" containerID="79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.894715 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5dnm2" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.915828 4941 scope.go:117] "RemoveContainer" containerID="28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.938398 4941 scope.go:117] "RemoveContainer" containerID="fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.953818 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5dnm2"] Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.965707 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5dnm2"] Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.980594 4941 scope.go:117] "RemoveContainer" containerID="79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c" Mar 07 08:10:28 crc kubenswrapper[4941]: E0307 08:10:28.981086 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c\": container with ID starting with 79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c not found: ID does not exist" containerID="79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 
08:10:28.981123 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c"} err="failed to get container status \"79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c\": rpc error: code = NotFound desc = could not find container \"79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c\": container with ID starting with 79d45436af34930709e939ffabd5bf14385e734aa582c0eb741a8fc4e841833c not found: ID does not exist" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.981150 4941 scope.go:117] "RemoveContainer" containerID="28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe" Mar 07 08:10:28 crc kubenswrapper[4941]: E0307 08:10:28.981385 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe\": container with ID starting with 28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe not found: ID does not exist" containerID="28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.981432 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe"} err="failed to get container status \"28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe\": rpc error: code = NotFound desc = could not find container \"28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe\": container with ID starting with 28bea39bc98ea5a8615edb3d021eecd69db8d8860c00c295c6ccbe36cb54acbe not found: ID does not exist" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.981449 4941 scope.go:117] "RemoveContainer" containerID="fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa" Mar 07 08:10:28 crc 
kubenswrapper[4941]: E0307 08:10:28.981753 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa\": container with ID starting with fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa not found: ID does not exist" containerID="fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa" Mar 07 08:10:28 crc kubenswrapper[4941]: I0307 08:10:28.981790 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa"} err="failed to get container status \"fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa\": rpc error: code = NotFound desc = could not find container \"fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa\": container with ID starting with fecdec8edbb511a0ea8cc9999b038e9569001ebaa361701b8d5eb2e615cca9fa not found: ID does not exist" Mar 07 08:10:29 crc kubenswrapper[4941]: I0307 08:10:29.972090 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" path="/var/lib/kubelet/pods/2e6c569f-5a19-40ca-9225-db252b79f413/volumes" Mar 07 08:11:40 crc kubenswrapper[4941]: I0307 08:11:40.314292 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 08:11:40 crc kubenswrapper[4941]: I0307 08:11:40.314946 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 07 08:11:51 crc kubenswrapper[4941]: I0307 08:11:51.737773 4941 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-kbm66" podUID="c0b92dab-fef5-4bf2-b07d-f3787dc8060c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.91:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.076051 4941 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dcrks"] Mar 07 08:11:56 crc kubenswrapper[4941]: E0307 08:11:56.077021 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" containerName="extract-utilities" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.077038 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" containerName="extract-utilities" Mar 07 08:11:56 crc kubenswrapper[4941]: E0307 08:11:56.077051 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" containerName="registry-server" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.077060 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" containerName="registry-server" Mar 07 08:11:56 crc kubenswrapper[4941]: E0307 08:11:56.077078 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerName="extract-utilities" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.077085 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerName="extract-utilities" Mar 07 08:11:56 crc kubenswrapper[4941]: E0307 08:11:56.077106 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" 
containerName="registry-server" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.077114 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerName="registry-server" Mar 07 08:11:56 crc kubenswrapper[4941]: E0307 08:11:56.077129 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" containerName="extract-content" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.077135 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" containerName="extract-content" Mar 07 08:11:56 crc kubenswrapper[4941]: E0307 08:11:56.077144 4941 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerName="extract-content" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.077150 4941 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerName="extract-content" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.077312 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba6cd33-5205-4ae5-ba7c-18a58c9462b1" containerName="registry-server" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.077328 4941 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6c569f-5a19-40ca-9225-db252b79f413" containerName="registry-server" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.078383 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.105632 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcrks"] Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.208758 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-catalog-content\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.208823 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brb6d\" (UniqueName: \"kubernetes.io/projected/90b687e2-7ad6-4d27-ab47-a98f7127f022-kube-api-access-brb6d\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.208854 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-utilities\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.309679 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-utilities\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.309825 4941 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-catalog-content\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.309863 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brb6d\" (UniqueName: \"kubernetes.io/projected/90b687e2-7ad6-4d27-ab47-a98f7127f022-kube-api-access-brb6d\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.310238 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-utilities\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.310243 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-catalog-content\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.337270 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brb6d\" (UniqueName: \"kubernetes.io/projected/90b687e2-7ad6-4d27-ab47-a98f7127f022-kube-api-access-brb6d\") pod \"certified-operators-dcrks\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") " pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.395620 4941 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dcrks" Mar 07 08:11:56 crc kubenswrapper[4941]: I0307 08:11:56.847710 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcrks"] Mar 07 08:11:57 crc kubenswrapper[4941]: I0307 08:11:57.360279 4941 generic.go:334] "Generic (PLEG): container finished" podID="90b687e2-7ad6-4d27-ab47-a98f7127f022" containerID="bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556" exitCode=0 Mar 07 08:11:57 crc kubenswrapper[4941]: I0307 08:11:57.360343 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcrks" event={"ID":"90b687e2-7ad6-4d27-ab47-a98f7127f022","Type":"ContainerDied","Data":"bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556"} Mar 07 08:11:57 crc kubenswrapper[4941]: I0307 08:11:57.360636 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcrks" event={"ID":"90b687e2-7ad6-4d27-ab47-a98f7127f022","Type":"ContainerStarted","Data":"b3a68aec29b2440f668e19a13b21627f95fa5d8aadaa84893ef1b9ae44aeaacf"} Mar 07 08:11:57 crc kubenswrapper[4941]: I0307 08:11:57.362920 4941 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 08:11:59 crc kubenswrapper[4941]: I0307 08:11:59.379083 4941 generic.go:334] "Generic (PLEG): container finished" podID="90b687e2-7ad6-4d27-ab47-a98f7127f022" containerID="c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79" exitCode=0 Mar 07 08:11:59 crc kubenswrapper[4941]: I0307 08:11:59.379193 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcrks" event={"ID":"90b687e2-7ad6-4d27-ab47-a98f7127f022","Type":"ContainerDied","Data":"c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79"} Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.159541 4941 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29547852-m888d"]
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.160764 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547852-m888d"
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.162589 4941 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnv4c"
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.163924 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.163951 4941 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.166356 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547852-m888d"]
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.267947 4941 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhdj\" (UniqueName: \"kubernetes.io/projected/93f4c55b-aded-40bc-8cbf-b710a64fd40e-kube-api-access-dlhdj\") pod \"auto-csr-approver-29547852-m888d\" (UID: \"93f4c55b-aded-40bc-8cbf-b710a64fd40e\") " pod="openshift-infra/auto-csr-approver-29547852-m888d"
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.372088 4941 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhdj\" (UniqueName: \"kubernetes.io/projected/93f4c55b-aded-40bc-8cbf-b710a64fd40e-kube-api-access-dlhdj\") pod \"auto-csr-approver-29547852-m888d\" (UID: \"93f4c55b-aded-40bc-8cbf-b710a64fd40e\") " pod="openshift-infra/auto-csr-approver-29547852-m888d"
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.390179 4941 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhdj\" (UniqueName: \"kubernetes.io/projected/93f4c55b-aded-40bc-8cbf-b710a64fd40e-kube-api-access-dlhdj\") pod \"auto-csr-approver-29547852-m888d\" (UID: \"93f4c55b-aded-40bc-8cbf-b710a64fd40e\") " pod="openshift-infra/auto-csr-approver-29547852-m888d"
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.489105 4941 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547852-m888d"
Mar 07 08:12:00 crc kubenswrapper[4941]: I0307 08:12:00.740877 4941 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547852-m888d"]
Mar 07 08:12:01 crc kubenswrapper[4941]: I0307 08:12:01.393536 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547852-m888d" event={"ID":"93f4c55b-aded-40bc-8cbf-b710a64fd40e","Type":"ContainerStarted","Data":"614279c0fd02ee33cd05a959a2ff9701c416f7e0fe895b2ba6943bdd7db5f9f4"}
Mar 07 08:12:01 crc kubenswrapper[4941]: I0307 08:12:01.396499 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcrks" event={"ID":"90b687e2-7ad6-4d27-ab47-a98f7127f022","Type":"ContainerStarted","Data":"00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea"}
Mar 07 08:12:02 crc kubenswrapper[4941]: I0307 08:12:02.405351 4941 generic.go:334] "Generic (PLEG): container finished" podID="93f4c55b-aded-40bc-8cbf-b710a64fd40e" containerID="b9f698815ba0da567a91645bcfaae579b0bbad13dcb03f9365f61395216eb508" exitCode=0
Mar 07 08:12:02 crc kubenswrapper[4941]: I0307 08:12:02.405424 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547852-m888d" event={"ID":"93f4c55b-aded-40bc-8cbf-b710a64fd40e","Type":"ContainerDied","Data":"b9f698815ba0da567a91645bcfaae579b0bbad13dcb03f9365f61395216eb508"}
Mar 07 08:12:02 crc kubenswrapper[4941]: I0307 08:12:02.422918 4941 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dcrks" podStartSLOduration=3.937583 podStartE2EDuration="6.422898932s" podCreationTimestamp="2026-03-07 08:11:56 +0000 UTC" firstStartedPulling="2026-03-07 08:11:57.362630689 +0000 UTC m=+4814.314996154" lastFinishedPulling="2026-03-07 08:11:59.847946621 +0000 UTC m=+4816.800312086" observedRunningTime="2026-03-07 08:12:01.417227556 +0000 UTC m=+4818.369593031" watchObservedRunningTime="2026-03-07 08:12:02.422898932 +0000 UTC m=+4819.375264407"
Mar 07 08:12:03 crc kubenswrapper[4941]: I0307 08:12:03.706359 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547852-m888d"
Mar 07 08:12:03 crc kubenswrapper[4941]: I0307 08:12:03.826584 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlhdj\" (UniqueName: \"kubernetes.io/projected/93f4c55b-aded-40bc-8cbf-b710a64fd40e-kube-api-access-dlhdj\") pod \"93f4c55b-aded-40bc-8cbf-b710a64fd40e\" (UID: \"93f4c55b-aded-40bc-8cbf-b710a64fd40e\") "
Mar 07 08:12:03 crc kubenswrapper[4941]: I0307 08:12:03.835011 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f4c55b-aded-40bc-8cbf-b710a64fd40e-kube-api-access-dlhdj" (OuterVolumeSpecName: "kube-api-access-dlhdj") pod "93f4c55b-aded-40bc-8cbf-b710a64fd40e" (UID: "93f4c55b-aded-40bc-8cbf-b710a64fd40e"). InnerVolumeSpecName "kube-api-access-dlhdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 08:12:03 crc kubenswrapper[4941]: I0307 08:12:03.928999 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlhdj\" (UniqueName: \"kubernetes.io/projected/93f4c55b-aded-40bc-8cbf-b710a64fd40e-kube-api-access-dlhdj\") on node \"crc\" DevicePath \"\""
Mar 07 08:12:04 crc kubenswrapper[4941]: I0307 08:12:04.424276 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547852-m888d" event={"ID":"93f4c55b-aded-40bc-8cbf-b710a64fd40e","Type":"ContainerDied","Data":"614279c0fd02ee33cd05a959a2ff9701c416f7e0fe895b2ba6943bdd7db5f9f4"}
Mar 07 08:12:04 crc kubenswrapper[4941]: I0307 08:12:04.424317 4941 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="614279c0fd02ee33cd05a959a2ff9701c416f7e0fe895b2ba6943bdd7db5f9f4"
Mar 07 08:12:04 crc kubenswrapper[4941]: I0307 08:12:04.424384 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547852-m888d"
Mar 07 08:12:04 crc kubenswrapper[4941]: I0307 08:12:04.774579 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547846-kt4cw"]
Mar 07 08:12:04 crc kubenswrapper[4941]: I0307 08:12:04.782007 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547846-kt4cw"]
Mar 07 08:12:05 crc kubenswrapper[4941]: I0307 08:12:05.967070 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46707f7-6b0d-43e9-a652-c17c984f4ccf" path="/var/lib/kubelet/pods/b46707f7-6b0d-43e9-a652-c17c984f4ccf/volumes"
Mar 07 08:12:06 crc kubenswrapper[4941]: I0307 08:12:06.395863 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dcrks"
Mar 07 08:12:06 crc kubenswrapper[4941]: I0307 08:12:06.395916 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dcrks"
Mar 07 08:12:06 crc kubenswrapper[4941]: I0307 08:12:06.459298 4941 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dcrks"
Mar 07 08:12:06 crc kubenswrapper[4941]: I0307 08:12:06.506125 4941 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dcrks"
Mar 07 08:12:06 crc kubenswrapper[4941]: I0307 08:12:06.699936 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcrks"]
Mar 07 08:12:08 crc kubenswrapper[4941]: I0307 08:12:08.468064 4941 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dcrks" podUID="90b687e2-7ad6-4d27-ab47-a98f7127f022" containerName="registry-server" containerID="cri-o://00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea" gracePeriod=2
Mar 07 08:12:08 crc kubenswrapper[4941]: I0307 08:12:08.887132 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcrks"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.006287 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-utilities\") pod \"90b687e2-7ad6-4d27-ab47-a98f7127f022\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") "
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.006941 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brb6d\" (UniqueName: \"kubernetes.io/projected/90b687e2-7ad6-4d27-ab47-a98f7127f022-kube-api-access-brb6d\") pod \"90b687e2-7ad6-4d27-ab47-a98f7127f022\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") "
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.007034 4941 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-catalog-content\") pod \"90b687e2-7ad6-4d27-ab47-a98f7127f022\" (UID: \"90b687e2-7ad6-4d27-ab47-a98f7127f022\") "
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.007825 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-utilities" (OuterVolumeSpecName: "utilities") pod "90b687e2-7ad6-4d27-ab47-a98f7127f022" (UID: "90b687e2-7ad6-4d27-ab47-a98f7127f022"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.013134 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b687e2-7ad6-4d27-ab47-a98f7127f022-kube-api-access-brb6d" (OuterVolumeSpecName: "kube-api-access-brb6d") pod "90b687e2-7ad6-4d27-ab47-a98f7127f022" (UID: "90b687e2-7ad6-4d27-ab47-a98f7127f022"). InnerVolumeSpecName "kube-api-access-brb6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.109353 4941 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brb6d\" (UniqueName: \"kubernetes.io/projected/90b687e2-7ad6-4d27-ab47-a98f7127f022-kube-api-access-brb6d\") on node \"crc\" DevicePath \"\""
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.109439 4941 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-utilities\") on node \"crc\" DevicePath \"\""
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.416328 4941 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90b687e2-7ad6-4d27-ab47-a98f7127f022" (UID: "90b687e2-7ad6-4d27-ab47-a98f7127f022"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.478959 4941 generic.go:334] "Generic (PLEG): container finished" podID="90b687e2-7ad6-4d27-ab47-a98f7127f022" containerID="00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea" exitCode=0
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.479020 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcrks" event={"ID":"90b687e2-7ad6-4d27-ab47-a98f7127f022","Type":"ContainerDied","Data":"00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea"}
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.479031 4941 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcrks"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.479058 4941 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcrks" event={"ID":"90b687e2-7ad6-4d27-ab47-a98f7127f022","Type":"ContainerDied","Data":"b3a68aec29b2440f668e19a13b21627f95fa5d8aadaa84893ef1b9ae44aeaacf"}
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.479085 4941 scope.go:117] "RemoveContainer" containerID="00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.501805 4941 scope.go:117] "RemoveContainer" containerID="c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.515862 4941 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b687e2-7ad6-4d27-ab47-a98f7127f022-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.521126 4941 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcrks"]
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.525987 4941 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dcrks"]
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.544818 4941 scope.go:117] "RemoveContainer" containerID="bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.569912 4941 scope.go:117] "RemoveContainer" containerID="00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea"
Mar 07 08:12:09 crc kubenswrapper[4941]: E0307 08:12:09.570286 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea\": container with ID starting with 00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea not found: ID does not exist" containerID="00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.570326 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea"} err="failed to get container status \"00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea\": rpc error: code = NotFound desc = could not find container \"00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea\": container with ID starting with 00510bce15ba09e29ee414f94f7e830c0d9e47813295f572bedbbbc27be953ea not found: ID does not exist"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.570345 4941 scope.go:117] "RemoveContainer" containerID="c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79"
Mar 07 08:12:09 crc kubenswrapper[4941]: E0307 08:12:09.570756 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79\": container with ID starting with c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79 not found: ID does not exist" containerID="c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.570802 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79"} err="failed to get container status \"c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79\": rpc error: code = NotFound desc = could not find container \"c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79\": container with ID starting with c1ec15e7dee7604cafc003f3771d287d55f3d3571b227122b85390f370d0bf79 not found: ID does not exist"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.570830 4941 scope.go:117] "RemoveContainer" containerID="bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556"
Mar 07 08:12:09 crc kubenswrapper[4941]: E0307 08:12:09.571074 4941 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556\": container with ID starting with bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556 not found: ID does not exist" containerID="bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.571111 4941 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556"} err="failed to get container status \"bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556\": rpc error: code = NotFound desc = could not find container \"bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556\": container with ID starting with bb258a2b181edc5124503d53bdb173fe5304d823b16d39a425f9e3ec3edf2556 not found: ID does not exist"
Mar 07 08:12:09 crc kubenswrapper[4941]: I0307 08:12:09.966126 4941 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b687e2-7ad6-4d27-ab47-a98f7127f022" path="/var/lib/kubelet/pods/90b687e2-7ad6-4d27-ab47-a98f7127f022/volumes"
Mar 07 08:12:10 crc kubenswrapper[4941]: I0307 08:12:10.314546 4941 patch_prober.go:28] interesting pod/machine-config-daemon-knkqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 08:12:10 crc kubenswrapper[4941]: I0307 08:12:10.314622 4941 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-knkqz" podUID="250d2c0d-993b-466a-a5e0-bacae5fe8df5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 08:12:10 crc kubenswrapper[4941]: I0307 08:12:10.620084 4941 scope.go:117] "RemoveContainer" containerID="12c58124dcfa9d5c8f44e7085ba9dafd9ee1f47dfe925bd227fe6b6676a290fd"